Different Representations for Recursive Systematic Convolutional Codes and Their Associated Decoding for Turbo Codes

Hong SUN*, Zhong-jie LI*, and Didier LE RUYET**

* Huazhong University of Science and Technology (HUST), Department of Electronic and Information Engineering, 430074 Wuhan, China. Email: [email protected]
** Conservatoire National des Arts et Métiers (CNAM), Laboratoire Signaux et Systèmes, 75141 Paris Cedex 03, France. Email: [email protected]

Abstract – Representations of recursive systematic convolutional (RSC) codes based on parity-check equations can be flexibly transformed into different forms, with or without auxiliary variables. The equations can be directly realized as a conventional Tanner graph with cycles. The compound cross-section (CCS) graph, a cycle-free conventional Tanner graph obtained by introducing auxiliary edges, is proposed for different structures of RSC codes. We show that the sum-product algorithm on the CCS graph is equivalent to the BCJR algorithm. Finally, some turbo code schemes with conventional Tanner graphs and their associated decoding algorithms are discussed.

Keywords: convolutional codes, belief propagation, iterative decoding, Tanner graphs, turbo codes

1. INTRODUCTION

Recently, different researchers have studied the turbo decoding algorithm using graphical models. They have shown the connection between the turbo decoding algorithm as originally expounded by Berrou et al. [1] and Pearl's belief propagation algorithm. The goal of this paper is to describe RSC codes and turbo codes with conventional Tanner graphs (consisting of binary variable nodes and parity-check function nodes, without trellis sections) and, accordingly, to derive low-complexity decoding algorithms. In [2], Kschischang et al. introduced graphical models called factor graphs, which include Bayesian networks and Tanner graphs, and presented rules to compute marginal probabilities in factor graphs. The concept of factor graphs and these rules are the basis of our work in this paper. The turbo-decoding algorithm has been described in Bayesian networks by McEliece, MacKay et al. [3] and by Kschischang and Frey [4], and in generic Tanner graphs by Wiberg [5]. In these models, auxiliary variables (or hidden variables) which are not binary, together with local functions given in terms of a trellis, are introduced so that the constituent codes can be described in a cycle-free graph. In this

paper, we search for code descriptions based on conventional Tanner graphs in which all variables are binary and each local function is a parity-check function. Conventional Tanner graphs are directly deduced from parity-check equations, which can be flexibly transformed into other forms. We can then derive low-complexity decoding algorithms on conventional Tanner graphs. These algorithms will provide new efficient hardware architectures for turbo decoding. In section 2, we derive different descriptions of RSC codes. Then, in section 3, we propose a cycle-free conventional Tanner graph for the different representations and apply the sum-product algorithm to this graph. Finally, in section 4, we present some turbo code schemes with conventional Tanner graphs and their corresponding decoding algorithms.

2. REPRESENTATIONS OF RECURSIVE CONVOLUTIONAL CODES

A recursive convolutional code is defined by the generator G(D) = Q(D)/P(D), where D is the unit-delay operator and Q(D) and P(D) are polynomials of degree m over the finite field GF(2). The input-output equation of the encoder with input U(D) and output X(D) is written as

  X(D) = (Q(D)/P(D)) U(D)    (1)
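As a concrete illustration (a sketch with names of our own choosing, not code from the paper), equation (1) can be realized by a shift-register encoder that forms an auxiliary sequence s(k) through the feedback polynomial P(D) and taps the feedforward polynomial Q(D):

```python
# Sketch of a recursive systematic parity generator for
# X(D) = (Q(D)/P(D)) U(D): s(k) = u(k) + p_1 s(k-1) + ... + p_m s(k-m),
# x(k) = q_0 s(k) + ... + q_m s(k-m), all arithmetic over GF(2).
# The function name and interface are ours, not from the paper.
def rsc_encode(u, p, q):
    """u: list of info bits; p, q: coefficients [c_0, ..., c_m] of P, Q
    (low order first, p[0] = 1). Returns the parity sequence x."""
    m = len(p) - 1
    reg = [0] * m                        # [s(k-1), ..., s(k-m)], zero state
    x = []
    for uk in u:
        sk = uk
        for i in range(1, m + 1):
            sk ^= p[i] & reg[i - 1]      # feedback taps of P(D)
        xk = q[0] & sk
        for i in range(1, m + 1):
            xk ^= q[i] & reg[i - 1]      # feedforward taps of Q(D)
        reg = [sk] + reg[:-1]
        x.append(xk)
    return x

# (7,5) code: P(D) = 1 + D + D^2, Q(D) = 1 + D^2; impulse input
assert rsc_encode([1, 0, 0, 0], [1, 1, 1], [1, 0, 1]) == [1, 1, 1, 0]
```

With zero initial state, the all-zero input yields the all-zero parity, while a single 1 produces an infinite periodic parity sequence — the recursive behavior that distinguishes RSC codes from feedforward convolutional codes.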

2.1 Equations without Auxiliary Variables

The equation above can be transformed into the following forms (2) and (3), without auxiliary variables, for the RSC code [U(D) X(D)]. From (1), we directly have

  Q(D)U(D) + P(D)X(D) = 0    (2a)

and (2a) means the following parity-check equation (2b) for all k:

  ∑_{i=0}^{m} q_i u(k−i) + ∑_{i=0}^{m} p_i x(k−i) = 0    (2b)
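As a quick numerical check (helper code of our own, not from the paper), one can verify that a (7,5)-encoded sequence satisfies (2b) at every time k, with u(k) = x(k) = 0 assumed for k < 0:

```python
# Numerical check of parity-check equation (2b) for the (7,5) code:
# sum_i q_i u(k-i) + sum_i p_i x(k-i) = 0 (mod 2) for all k,
# treating u(k) = x(k) = 0 for k < 0 (zero initial state).
import random

p, q, m = [1, 1, 1], [1, 0, 1], 2        # P = 1+D+D^2, Q = 1+D^2

random.seed(0)
u = [random.randint(0, 1) for _ in range(40)]
s, x = [], []
for k, uk in enumerate(u):
    s1 = s[k - 1] if k >= 1 else 0       # s(k-1)
    s2 = s[k - 2] if k >= 2 else 0       # s(k-2)
    s.append(uk ^ s1 ^ s2)               # P(D)S(D) = U(D)
    x.append(s[k] ^ s2)                  # X(D) = Q(D)S(D)

def at(seq, i):
    return seq[i] if i >= 0 else 0       # sequences vanish for k < 0

for k in range(len(u)):
    lhs = sum(q[i] * at(u, k - i) for i in range(m + 1)) \
        + sum(p[i] * at(x, k - i) for i in range(m + 1))
    assert lhs % 2 == 0                  # (2b) holds at every k
```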

When P(D) is a primitive polynomial, there exists a polynomial M(D) of degree 2^m − 1 − m such that

  P(D)M(D) = 1 + D^{2^m − 1}

Accordingly, (1) can be rewritten as (3):

  H(D)U(D) + (1 + D^{2^m − 1})X(D) = 0    (3a)

where H(D) = Q(D)M(D), and the corresponding parity-check equation for all k is

  ∑_{i=0}^{2^m − 1} h_i u(k−i) + [x(k) + x(k − 2^m + 1)] = 0    (3b)

[Figure 1: Tanner graph of equation (5) for the (7,5) code. Variable nodes u_1..u_3, x_1..x_3 and states s_{−1}, s_0, …, s_3, received symbols y_1^s..y_3^s, y_1^p..y_3^p, parity-check nodes 1, 1′, 2, 2′, 3, 3′. Figure omitted.]
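For the (7,5) code this identity can be checked directly (a small sketch of ours, not from the paper): P(D) = 1 + D + D² is primitive with m = 2, so M(D) should have degree 2² − 1 − 2 = 1 and satisfy P(D)M(D) = 1 + D³; indeed M(D) = 1 + D works, giving H(D) = Q(D)M(D) = 1 + D + D² + D³:

```python
# Check of the primitive-polynomial identity behind equation (3),
# for the (7,5) code: P(D) = 1 + D + D^2 is primitive, m = 2, so
# deg M = 2^2 - 1 - 2 = 1 and P(D)M(D) = 1 + D^3 with M(D) = 1 + D.
def gf2_mul(a, b):
    """Multiply polynomials over GF(2); coefficient lists, low order first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

P, Q, M = [1, 1, 1], [1, 0, 1], [1, 1]
assert gf2_mul(P, M) == [1, 0, 0, 1]         # P(D)M(D) = 1 + D^3
assert gf2_mul(Q, M) == [1, 1, 1, 1]         # H(D) = 1 + D + D^2 + D^3
```

So (3b) for this code reads u(k) + u(k−1) + u(k−2) + u(k−3) + x(k) + x(k−3) = 0, coupling only parity symbols three positions apart.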

2.2 Equations with Auxiliary Variables

In order to obtain a realization of RSC codes in a cycle-free graph, non-binary auxiliary variables are generally introduced. In this paper, we use only binary auxiliary variables s_k to obtain a conventional Tanner graph. Now, let

  S(D) = U(D)/P(D) = ∑_k s(k) D^k,   s(k) ∈ GF(2)

Then, (1) becomes

  U(D) = P(D)S(D)
  U(D) + X(D) = [P(D) + Q(D)]S(D)    (4a)

or

  X(D) = Q(D)S(D)
  U(D) + X(D) = [P(D) + Q(D)]S(D)    (4b)

where ∂(A) of a polynomial A(D) = ∑_k a(k)D^k, a(k) ∈ GF(2), denotes the number of coefficients a(k) = 1. We use (4a) when ∂(P) < ∂(Q), and (4b) when ∂(Q) < ∂(P).

3. RSC DECODING IN TANNER GRAPH

The decoding algorithm of RSC codes can be derived exactly on the conventional Tanner graph using equations (4) with auxiliary variables. To illustrate this idea, we take the (7,5) code as an example. Since ∂(7) > ∂(5), where (7) = 1 + D + D² and (5) = 1 + D², (4b) should be used. With P + Q = (7) + (5) = (2), (4b) can be written as

  u(k) + x(k) = s(k − 1)
  x(k) = s(k) + s(k − 2),    u, x, s ∈ GF(2)    (5)

Fig.1 shows the Tanner graph of the above equations (5) with the noisy received symbols y and parity-check functions ⊕. This graph has cycles. We will employ different cross-sections in order to eliminate the cycles.

3.1 Compound Cross-section Graph and Hidden Edges

In [2], the rules to obtain a cross-section from a factor graph are given by: 1. replacing each local function having an argument with its corresponding cross-section; and 2. omitting variable nodes and any edges incident on them. An example of these rules is shown in Fig.2.

[Figure 2: Factor graph for the cross-section (Kschischang and Frey et al. [2]). Local functions a(x_1, x_2) and b(x_2, x_3) with variable nodes x_1, x_2, x_3 become a(x_1 | x_2 = a_2) and b(x_3 | x_2 = a_2) with variable nodes x_1, x_3. Figure omitted.]
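The (7,5) instance of (4b) can be verified numerically (a sketch with our own variable names, not code from the paper): generating s(k) from s(k) = u(k) + s(k−1) + s(k−2) and x(k) = s(k) + s(k−2), the relation u(k) + x(k) = s(k−1) holds at every k:

```python
# Numerical check of the (7,5) instance of (4b): with
# s(k) = u(k) + s(k-1) + s(k-2) and x(k) = s(k) + s(k-2) over GF(2),
# we verify u(k) + x(k) = s(k-1) at every time index.
import random

random.seed(1)
u = [random.randint(0, 1) for _ in range(30)]
s, x = [], []
for k, uk in enumerate(u):
    s1 = s[k - 1] if k >= 1 else 0       # s(k-1), zero initial state
    s2 = s[k - 2] if k >= 2 else 0       # s(k-2)
    s.append(uk ^ s1 ^ s2)               # S(D) = U(D)/P(D)
    x.append(s[k] ^ s2)                  # x(k) = s(k) + s(k-2)

for k in range(len(u)):
    s1 = s[k - 1] if k >= 1 else 0
    assert (u[k] ^ x[k]) == s1           # u(k) + x(k) = s(k-1)
```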

We have noted that the local function in a conventional Tanner graph is a simple parity check; thus a local function having an argument would be an even or odd parity check, which is equivalent to the function connected to the assignment value 0 or 1. Of course, every variable assignment yields a cross-section graph; however, there are only 2 possible assignment values (0 or 1) for each variable. These two observations allow us to put all cross-section graphs together in one conventional Tanner graph, shown in Fig.3, by introducing auxiliary edges or hidden edges (dashed in Fig.3) to pass the message of the assignment for the cross-section. We call this type of graph a compound cross-section (CCS) graph, in which the message of the conditional probability and that of the condition (assignment value) are simultaneously propagated through edges and auxiliary edges, respectively.

Since the a priori information about any hidden variable s_k is zero, or say, the a priori probability p_a is

  p_a(s_k) = [p_a(s_k = 0), p_a(s_k = 1)] = [0.5, 0.5],

we have p(s_k, s_j) = p(s_j, s_k). In fact, p(s_j | s_k) is the projection of p(s_k, s_j) on the coordinate s_j:

  p(s_j)(s_k = 0) = [p_1, p_3]
  p(s_j)(s_k = 1) = [p_2, p_4]

and p(s_k | s_j) is the projection of p(s_k, s_j) on the coordinate s_k:

  p(s_k)(s_j = 0) = [p_1, p_2]
  p(s_k)(s_j = 1) = [p_3, p_4]

It is easy to prove that

  [p(s_k | s_j)] = [p(s_j | s_k)]^T

This means that there exist hidden edges between s_k and s_j in a CCS graph, which are represented by bold dashed lines in Fig.3. Note that no computation is needed to send a message through these hidden edges except a rearrangement. When m = 2, this rearrangement is a simple transposition.

3.2 Sum-product Algorithm in CCS Graph

Now, an exact solution is given by the sum-product (⊕, ⊗) algorithm following the forward/backward schedule in Fig.3. In the chain of message passing from left to right (forward) of p_f, we have two steps to pass from α_{k−2} to α_{k−1}. The first step is, through check node ⊕_{k−1}, to pass from α_{k−2} to α̂_{k−1}:

  α̂^0_{k−1} = α^0_{k−2} ⊕_{k−1} [x_{k−1} ⊗ (u_{k−1} ⊕_{(k−1)′} 0)]
  α̂^1_{k−1} = α^1_{k−2} ⊕_{k−1} [x_{k−1} ⊗ (u_{k−1} ⊕_{(k−1)′} 1)]

The second step is, through the hidden edge {s_{k−1}, s′_{k−2}}, to pass from α̂_{k−1} to α_{k−1}:

  [α^0_{k−1} ; α^1_{k−1}] = [α̂^0_{k−1} ; α̂^1_{k−1}]^T

The same procedure occurs in the chain of message passing from right to left (backward) of p_b. Through check node ⊕_{k+1}, we have:

  β̂^0_k = β^0_{k+1} ⊕_{k+1} [x_{k+1} ⊗ (u_{k+1} ⊕_{(k+1)′} 0)]
  β̂^1_k = β^1_{k+1} ⊕_{k+1} [x_{k+1} ⊗ (u_{k+1} ⊕_{(k+1)′} 1)]

and through the hidden edge {s′_{k−1}, s_k}, we have:

  [β^0_k ; β^1_k] = [β̂^0_k ; β̂^1_k]^T

We refer to the soft output, or the conditional a posteriori probability for u_k given Y, as e_k (bold arrow in Fig.3):

  e^0_k = [(α^0_{k−1} ⊕_k β^0_k) ⊗ x_k] ⊕_{k′} 0
  e^1_k = [(α^1_{k−1} ⊕_k β^1_k) ⊗ x_k] ⊕_{k′} 1
  e_k = e^0_k + e^1_k = [p^e_1 + p^e_3, p^e_2 + p^e_4]

where

  e^0_k = [p^e_1(u_k = 0), p^e_2(u_k = 1)]    (s_{k−1} = 0)
  e^1_k = [p^e_3(u_k = 0), p^e_4(u_k = 1)]    (s_{k−1} = 1)

We find that α_k and β_k in Fig.3 are exactly the α_k(S) and β_k(S) of the BCJR algorithm, where

  α_k(S) = p(S_k = S, Y_1^k) = ∑_{S′} α_{k−1}(S′) γ_k(S′, S)
  β_k(S′) = p(Y_{k+1}^N | S_k = S′) = ∑_S β_{k+1}(S) γ_{k+1}(S′, S)
  γ_k(S′, S) = p(S_k = S, y_k | S_{k−1} = S′)

Note that s_k ∈ GF(2) in the Tanner graph of Fig.3 while S_k ∈ GF(2^m) in the BCJR algorithm. Then, e_k in Fig.3 is exactly the extrinsic information for u_k in the BCJR algorithm. It should be noticed that the number of multiplications for computing e_k in Fig.3 is half of that in the BCJR algorithm. More importantly, Fig.3 exposes the message-passing procedure of the BCJR algorithm in a conventional Tanner graph. The above algorithm can be used for the decoding of turbo codes. The generalization to other RSC codes is straightforward.
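To illustrate the equivalence claimed above, the α/β/γ recursions can be implemented directly for the (7,5) trellis (state S_k = (s(k−1), s(k−2)) ∈ GF(2²)) and checked against exhaustive marginalization; the setup below (random likelihood arrays, short block, unterminated trellis) is our own illustrative assumption, not from the paper:

```python
# Equivalence check of the forward/backward (BCJR) recursions quoted
# above against exhaustive marginalization, for the (7,5) code.
import itertools
import random

def trans(state, u):
    """One trellis step: returns (next_state, parity x(k))."""
    a, b = (state >> 1) & 1, state & 1   # a = s(k-1), b = s(k-2)
    sk = u ^ a ^ b                       # s(k) = u(k) + s(k-1) + s(k-2)
    return (sk << 1) | a, sk ^ b         # x(k) = s(k) + s(k-2)

random.seed(2)
K = 6
Lu = [[random.random(), random.random()] for _ in range(K)]  # p(y_k^s | u_k)
Lx = [[random.random(), random.random()] for _ in range(K)]  # p(y_k^p | x_k)

# Exhaustive marginalization: p(u_k = b, Y) summed over all 2^K inputs.
post = [[0.0, 0.0] for _ in range(K)]
for bits in itertools.product((0, 1), repeat=K):
    st, w = 0, 1.0
    for k, uk in enumerate(bits):
        st, xk = trans(st, uk)
        w *= Lu[k][uk] * Lx[k][xk]
    for k, uk in enumerate(bits):
        post[k][uk] += w

# Forward recursion: alpha_k(S) = sum_{S'} alpha_{k-1}(S') gamma_k(S', S)
alpha = [[0.0] * 4 for _ in range(K + 1)]
alpha[0][0] = 1.0                        # encoder starts in the zero state
for k in range(K):
    for st in range(4):
        for uk in (0, 1):
            st2, xk = trans(st, uk)
            alpha[k + 1][st2] += alpha[k][st] * Lu[k][uk] * Lx[k][xk]

# Backward recursion: beta_k(S') = sum_S beta_{k+1}(S) gamma_{k+1}(S', S)
beta = [[1.0] * 4 for _ in range(K + 1)] # unterminated: beta_K uniform
for k in range(K - 1, -1, -1):
    for st in range(4):
        beta[k][st] = sum(
            beta[k + 1][trans(st, uk)[0]] * Lu[k][uk] * Lx[k][trans(st, uk)[1]]
            for uk in (0, 1))

# Combine alpha, gamma, beta into the bit posterior p(u_k = b, Y).
bcjr = []
for k in range(K):
    pk = [0.0, 0.0]
    for st in range(4):
        for uk in (0, 1):
            st2, xk = trans(st, uk)
            pk[uk] += alpha[k][st] * Lu[k][uk] * Lx[k][xk] * beta[k + 1][st2]
    bcjr.append(pk)

for k in range(K):
    for b in (0, 1):
        assert abs(post[k][b] - bcjr[k][b]) < 1e-9
```

The two computations group the same sum of products differently, so they agree exactly (up to floating-point rounding), which is the trellis-level counterpart of the CCS-graph equivalence argued in the text.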

4. DIFFERENT CCS GRAPH MODELS

From parity-check equation (2) without auxiliary variables, the corresponding CCS graph for the (7,5) code is shown in Fig.4. The conditional probability p(u_k | x_k), with its a priori probability, is propagated in the graph. Note that the graph is composed of two independent trees: the first one comprises the odd check nodes and the second one the even check nodes. The corresponding CCS graph from equation (3) for the (7,5) code has three independent trees. One of them is shown in Fig.5, in which the conditional probability p(u_k | x_k) is also propagated. With a suitable puncturing scheme, this graph allows us to use the information of all observed symbols x_k (k = 1, 4, 7, …, 3i + 1, …, for example), and the punctured symbols are not required for decoding. For m = 2, the puncturer's period is equal to 3, and the graphs for decoding each constituent code are the same as Fig.5.

5. CONCLUSION

We have presented different representations for recursive systematic convolutional (RSC) codes, which can be realized directly by conventional Tanner graphs with cycles. We have also shown how compound cross-sections (CCS) can be used to represent RSC codes in a cycle-free conventional Tanner graph. We believe that CCS graph models will provide low-complexity decoding algorithms for turbo codes.

Acknowledgments: This work is supported in part by the NSF of China and under a cooperative agreement between H. Vu Thien at CNAM and H. Sun at HUST.

REFERENCES
[1] C. Berrou, A. Glavieux and P. Thitimajshima, "Near Shannon-limit error-correcting coding and decoding: turbo codes", Proc. ICC'93, pp. 1064-1070, Geneva, May 1993.
[2] F. R. Kschischang, B. J. Frey and H.-A. Loeliger, "Factor graphs and the sum-product algorithm", submitted to IEEE Trans. on Info.
[3] R. J. McEliece, D. J. C. MacKay and J.-F. Cheng, "Turbo decoding as an instance of Pearl's 'belief propagation' algorithm", IEEE Journal on Selected Areas in Comm., Vol. 16, pp. 140-152, Feb. 1998.
[4] F. R. Kschischang and B. J. Frey, "Iterative decoding of compound codes by probability propagation in graphical models", pp. 219-230.
[5] N. Wiberg, H.-A. Loeliger and R. Kötter, "Codes and iterative decoding on general graphs", Eur. Trans. Telecommun., Vol. 6, pp. 513-525, Sept./Oct. 1995.

[Figure 3: Compound cross-section graph for the Tanner graph of Fig.1. Variable nodes u_k, x_k, hidden variables s_k, s′_k, received symbols y_k^s, y_k^p, check nodes k, k′, hidden edges (dashed), forward messages α_{k−2}, α̂_{k−1}, α_{k−1}, backward messages β_k, β̂_k, β_{k+1}, and soft output e_k. Figure omitted.]

[Figure 4: CCS graph of (2), with variable nodes u_k, x_k, received symbols y_k^s, and check nodes k−1, k, k+1. Figure omitted.]

[Figure 5: CCS graph of (3), with variable nodes u_1..u_7, observed symbols x_1, x_4, x_7, and check nodes k−1, k. Figure omitted.]