- Digital Communications I:
Modulation and Coding Course
Period 3 - 2007
Catharina Logothetis
Lecture 5
- Last time we talked about:
Receiver structure
Impact of AWGN and ISI on the
transmitted signal
Optimum filter to maximize SNR
Matched filter and correlator receiver
Signal space used for detection
Orthogonal N-dimensional space
Signal to waveform transformation and vice versa
- Today we are going to talk about:
Signal detection in AWGN channels
Minimum distance detector
Maximum likelihood
Average probability of symbol error
Union bound on error probability
Upper bound on error probability based
on the minimum distance
- Detection of signal in AWGN
Detection problem:
Given the observation vector z, perform a mapping from z to an estimate m̂ of the transmitted symbol, m_i, such that the average probability of error in the decision is minimized.

[Block diagram: m_i → Modulator → s_i → ⊕ (noise n added) → z → Decision rule → m̂]
- Statistics of the observation Vector
AWGN channel model: z = s_i + n
Signal vector s_i = (a_{i1}, a_{i2}, ..., a_{iN}) is deterministic.
Elements of the noise vector n = (n_1, n_2, ..., n_N) are i.i.d. Gaussian random variables with zero mean and variance N_0/2. The noise vector pdf is

  p_{\mathbf{n}}(\mathbf{n}) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{n}\|^2}{N_0}\right)

The elements of the observed vector z = (z_1, z_2, ..., z_N) are independent Gaussian random variables. Its pdf is

  p_{\mathbf{z}}(\mathbf{z} \mid \mathbf{s}_i) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{z} - \mathbf{s}_i\|^2}{N_0}\right)
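The vector channel model above can be sketched in code. This is a minimal illustration (function name and example values are my own, not from the slides): each dimension of z is the deterministic signal coordinate plus an independent Gaussian sample of variance N_0/2.

```python
import math
import random

def awgn_channel(s_i, N0, rng=random):
    """Return z = s_i + n, with i.i.d. noise components ~ N(0, N0/2)."""
    sigma = math.sqrt(N0 / 2)  # per-dimension standard deviation
    return [a + rng.gauss(0.0, sigma) for a in s_i]

# Example: a 2-dimensional signal vector with N0 = 1
z = awgn_channel([1.0, 0.0], N0=1.0)
print(len(z))  # 2
```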
- Detection
Optimum decision rule (maximum a posteriori probability):

Set m̂ = m_i if
  \Pr(m_i \text{ sent} \mid \mathbf{z}) \ge \Pr(m_k \text{ sent} \mid \mathbf{z}), \quad \text{for all } k \ne i
where k = 1, ..., M.

Applying Bayes' rule gives:

Set m̂ = m_i if
  p_k \, \frac{p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})} \ \text{is maximum for } k = i
- Detection …
Partition the signal space into M decision regions, Z_1, ..., Z_M, such that

Vector z lies inside region Z_i if
  \ln\left[p_k \, \frac{p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}\right] \ \text{is maximum for } k = i.
That means
  m̂ = m_i
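The MAP rule can be sketched directly: evaluate ln[p_k · p(z | m_k)] for every candidate message (the common denominator p(z) does not affect the maximization) and pick the maximizer. This is a hypothetical two-message example, not from the slides; the Gaussian log-likelihood reduces to −‖z − s_k‖²/N_0 up to a constant.

```python
import math

def map_detect(z, signals, priors, N0):
    """Return the index i maximizing ln p_k + ln p(z|m_k) for an AWGN channel."""
    def metric(s, pk):
        d2 = sum((zj - sj) ** 2 for zj, sj in zip(z, s))
        # ln p(z|m_k) = -||z - s_k||^2 / N0 + const; the constant drops out
        return math.log(pk) - d2 / N0
    return max(range(len(signals)), key=lambda k: metric(signals[k], priors[k]))

signals = [[1.0], [-1.0]]
print(map_detect([0.2], signals, [0.5, 0.5], N0=1.0))  # 0 (closer to +1)
```

With unequal priors the decision boundary shifts toward the less likely symbol, which is exactly why the rule simplifies only for equally probable symbols.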
- Detection (ML rule)
For equally probable symbols, the optimum decision rule (maximum a posteriori probability) simplifies to:

Set m̂ = m_i if
  p_{\mathbf{z}}(\mathbf{z} \mid m_k) \ \text{is maximum for } k = i
or equivalently:

Set m̂ = m_i if
  \ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)] \ \text{is maximum for } k = i
which is known as the maximum likelihood (ML) rule.
- Detection (ML)…
Partition the signal space into M decision regions, Z_1, ..., Z_M.
Restate the maximum likelihood decision rule as follows:

Vector z lies inside region Z_i if
  \ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)] \ \text{is maximum for } k = i
That means
  m̂ = m_i
- Detection rule (ML)…
It can be simplified to:

Vector z lies inside region Z_i if
  \|\mathbf{z} - \mathbf{s}_k\| \ \text{is minimum for } k = i
or equivalently:

Vector z lies inside region Z_i if
  \sum_{j=1}^{N} z_j a_{kj} - \frac{1}{2} E_k \ \text{is maximum for } k = i
where E_k is the energy of s_k(t).
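The two formulations above are equivalent because ‖z − s_k‖² = ‖z‖² − 2⟨z, s_k⟩ + E_k, and ‖z‖² is common to all k. A small sketch (constellation and test point are illustrative choices, not from the slides) checks that both metrics pick the same symbol:

```python
import math

def min_distance_detect(z, signals):
    """ML rule as minimum Euclidean distance ||z - s_k||."""
    return min(range(len(signals)),
               key=lambda k: sum((zj - sj) ** 2 for zj, sj in zip(z, signals[k])))

def correlation_detect(z, signals):
    """ML rule as maximum of <z, s_k> - E_k/2."""
    def metric(s):
        corr = sum(zj * sj for zj, sj in zip(z, s))  # <z, s_k>
        Ek = sum(sj * sj for sj in s)                # energy of s_k
        return corr - Ek / 2
    return max(range(len(signals)), key=lambda k: metric(signals[k]))

signals = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]]  # QPSK-like
z = [0.7, -0.2]
print(min_distance_detect(z, signals) == correlation_detect(z, signals))  # True
```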
- Maximum likelihood detector block
diagram
[Block diagram: the received vector z feeds M parallel branches; branch k computes ⟨z, s_k⟩ − (1/2)E_k for k = 1, ..., M, and the "choose the largest" block outputs m̂.]
- Schematic example of ML decision regions
[Figure: four signal vectors s_1, ..., s_4 in the ψ_1(t)–ψ_2(t) plane, with decision regions Z_1, ..., Z_4 formed by the perpendicular bisectors between neighboring vectors.]
- Average probability of symbol error
Erroneous decision: for the transmitted symbol m_i, or equivalently the signal vector s_i, an error in the decision occurs if the observation vector z does not fall inside region Z_i.

Probability of an erroneous decision for a transmitted symbol:
  P_e(m_i) = \Pr(\hat{m} \ne m_i \text{ and } m_i \text{ sent})
or equivalently
  \Pr(\hat{m} \ne m_i) = \Pr(m_i \text{ sent}) \Pr(\mathbf{z} \text{ does not lie inside } Z_i \mid m_i \text{ sent})

Probability of a correct decision for a transmitted symbol:
  \Pr(\hat{m} = m_i) = \Pr(m_i \text{ sent}) \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent})
  P_c(m_i) = \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent}) = \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i) \, d\mathbf{z}
  P_e(m_i) = 1 - P_c(m_i)
- Av. prob. of symbol error …
Average probability of symbol error:
  P_E(M) = \sum_{i=1}^{M} \Pr(\hat{m} \ne m_i)

For equally probable symbols:
  P_E(M) = \frac{1}{M} \sum_{i=1}^{M} P_e(m_i) = 1 - \frac{1}{M} \sum_{i=1}^{M} P_c(m_i)
         = 1 - \frac{1}{M} \sum_{i=1}^{M} \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i) \, d\mathbf{z}
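The average P_E(M) can also be estimated by Monte Carlo simulation instead of evaluating the region integrals: transmit equally probable symbols through the AWGN model and count minimum-distance decision errors. This is an illustrative sketch with a QPSK-like constellation of my own choosing:

```python
import math
import random

def symbol_error_rate(signals, N0, trials=20000, rng=None):
    """Monte Carlo estimate of the average symbol error probability P_E(M)."""
    rng = rng or random.Random(0)
    sigma = math.sqrt(N0 / 2)
    errors = 0
    for _ in range(trials):
        i = rng.randrange(len(signals))                      # equally probable symbols
        z = [a + rng.gauss(0.0, sigma) for a in signals[i]]  # z = s_i + n
        # minimum-distance (ML) decision
        ihat = min(range(len(signals)),
                   key=lambda k: sum((zj - aj) ** 2
                                     for zj, aj in zip(z, signals[k])))
        errors += (ihat != i)
    return errors / trials

qpsk = [[1, 0], [0, 1], [-1, 0], [0, -1]]
print(symbol_error_rate(qpsk, N0=0.1))  # small, close to the union-bound prediction
```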
- Example for binary PAM
[Figure: conditional pdfs p_z(z | m_2) and p_z(z | m_1) centered at s_2 = −√E_b and s_1 = +√E_b on the ψ_1(t) axis; the decision threshold lies at 0.]

  P_e(m_1) = P_e(m_2) = Q\left(\frac{\|\mathbf{s}_1 - \mathbf{s}_2\| / 2}{\sqrt{N_0 / 2}}\right)

  P_B = P_E(2) = Q\left(\sqrt{\frac{2 E_b}{N_0}}\right)
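The closed-form result P_B = Q(√(2E_b/N_0)) is easy to evaluate numerically, since Q(x) = (1/2)·erfc(x/√2) and erfc is in the Python standard library. A small sketch (function names are mine):

```python
import math

def Q(x):
    """Gaussian tail function Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pam_bit_error_prob(EbN0_dB):
    """Binary PAM bit error probability P_B = Q(sqrt(2 Eb/N0))."""
    ebn0 = 10 ** (EbN0_dB / 10)  # dB -> linear
    return Q(math.sqrt(2 * ebn0))

for db in (0, 4, 8):
    print(f"Eb/N0 = {db} dB -> P_B = {pam_bit_error_prob(db):.3e}")
```

At 0 dB this gives P_B ≈ 7.9 × 10⁻², and the error probability falls steeply as E_b/N_0 grows.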
- Union bound
Union bound:
The probability of a finite union of events is upper bounded by the sum of the probabilities of the individual events.

Let A_{ki} denote the event that the observation vector z is closer to the symbol vector s_k than to s_i, when s_i is transmitted.
  \Pr(A_{ki}) = P_2(\mathbf{s}_k, \mathbf{s}_i) depends only on s_i and s_k.

Applying the union bound yields
  P_e(m_i) \le \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)
  P_E(M) \le \frac{1}{M} \sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)
- Example of union bound
[Figure: a four-point constellation s_1, ..., s_4 in the ψ_1–ψ_2 plane with decision regions Z_1, ..., Z_4, and three half-plane events A_2, A_3, A_4 for the pairwise comparisons of s_2, s_3, s_4 against s_1.]

  P_e(m_1) = \int_{Z_2 \cup Z_3 \cup Z_4} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}

Union bound:
  P_e(m_1) \le \sum_{k=2}^{4} P_2(\mathbf{s}_k, \mathbf{s}_1)

where
  P_2(\mathbf{s}_2, \mathbf{s}_1) = \int_{A_2} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}
  P_2(\mathbf{s}_3, \mathbf{s}_1) = \int_{A_3} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}
  P_2(\mathbf{s}_4, \mathbf{s}_1) = \int_{A_4} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}
- Upper bound based on minimum distance
  P_2(\mathbf{s}_k, \mathbf{s}_i) = \Pr(\mathbf{z} \text{ is closer to } \mathbf{s}_k \text{ than to } \mathbf{s}_i, \text{ when } \mathbf{s}_i \text{ is sent})
  = \int_{d_{ik}/2}^{\infty} \frac{1}{\sqrt{\pi N_0}} \exp\left(-\frac{u^2}{N_0}\right) du = Q\left(\frac{d_{ik}/2}{\sqrt{N_0/2}}\right)
  d_{ik} = \|\mathbf{s}_i - \mathbf{s}_k\|

  P_E(M) \le \frac{1}{M} \sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i) \le (M-1) \, Q\left(\frac{d_{\min}/2}{\sqrt{N_0/2}}\right)

Minimum distance in the signal space:
  d_{\min} = \min_{i \ne k} d_{ik}
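Both bounds are straightforward to compute for a given constellation. The sketch below (helper names and the QPSK-like constellation are illustrative choices) evaluates the full union bound and the looser minimum-distance bound; the latter replaces every pairwise distance by d_min, so it can never be tighter.

```python
import math

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def union_bound(signals, N0):
    """(1/M) * sum over i, k != i of Q(d_ik/2 / sqrt(N0/2))."""
    M = len(signals)
    total = sum(Q(_dist(signals[i], signals[k]) / 2 / math.sqrt(N0 / 2))
                for i in range(M) for k in range(M) if k != i)
    return total / M

def min_distance_bound(signals, N0):
    """(M - 1) * Q(d_min/2 / sqrt(N0/2))."""
    M = len(signals)
    dmin = min(_dist(si, sk)
               for idx, si in enumerate(signals) for sk in signals[idx + 1:])
    return (M - 1) * Q(dmin / 2 / math.sqrt(N0 / 2))

qpsk = [[1, 0], [0, 1], [-1, 0], [0, -1]]
print(union_bound(qpsk, 0.5) <= min_distance_bound(qpsk, 0.5))  # True
```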
- Example of upper bound on average symbol error probability based on the union bound
  \|\mathbf{s}_i\| = \sqrt{E_i} = \sqrt{E_s}, \quad i = 1, \ldots, 4
  d_{i,k} = \sqrt{2 E_s} \ \text{for neighboring symbol pairs}, \ i \ne k
  d_{\min} = \sqrt{2 E_s}

[Figure: QPSK-like constellation with s_1, ..., s_4 at ±√E_s on the ψ_1(t) and ψ_2(t) axes; the distances d_{1,2}, d_{2,3}, d_{3,4}, d_{1,4} between adjacent points are marked.]
- Eb/No figure of merit in digital
communications
SNR (S/N) is the ratio of average signal power to average noise power. In a digital communication system (DCS), SNR should be recast in terms of bit energy, because:
Signals are transmitted within a symbol duration and hence are energy signals (zero average power).
A bit-level figure of merit facilitates comparison of different DCSs that transmit different numbers of bits per symbol.

  \frac{E_b}{N_0} = \frac{S T_b}{N / W} = \frac{S}{N} \cdot \frac{W}{R_b}

where R_b is the bit rate and W is the bandwidth.
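The relation E_b/N_0 = (S/N)·(W/R_b) gives a direct conversion from a measured SNR to the bit-level figure of merit. A one-line sketch (function name and example numbers are illustrative):

```python
def ebn0_from_snr(snr_linear, bandwidth_hz, bit_rate_bps):
    """Eb/N0 (linear) from SNR, bandwidth W, and bit rate R_b:
    Eb/N0 = SNR * W / R_b."""
    return snr_linear * bandwidth_hz / bit_rate_bps

# Example: SNR = 10 (i.e. 10 dB), W = 1 MHz, R_b = 500 kbps
print(ebn0_from_snr(10.0, 1e6, 5e5))  # 20.0
```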