Belief Propagation with Information Correction: Near Maximum-Likelihood Decoding of LDPC Codes
Ned Varnica +, Marc Fossorier #, Alek Kavčić +
+ Division of Engineering and Applied Sciences, Harvard University
# Department of Electrical Engineering, University of Hawaii

Outline
Motivation – BP versus ML decoding
Improved iterative decoder of LDPC codes
Types of BP decoding errors
Simulation results

LDPC Code Graph
Parity check matrix H (N_c x N); a non-zero entry in H corresponds to an edge in G
Bipartite Tanner code graph G = (V, E, C)
Variable (symbol) nodes v_i ∈ V, i = 0, 1, …, N−1
Parity check nodes c_j ∈ C, j = 0, 1, …, N_c−1
Code rate R = k/N, k ≥ N − N_c
Belief Propagation: iterative propagation of conditional probabilities

Standard Belief Propagation on LDPC Codes
Locally operating; optimal for cycle-free graphs
Optimized LDPC codes (Luby et al. 98; Richardson, Shokrollahi & Urbanke 99; Hou, Siegel & Milstein 01; Varnica & Kavcic 02)
Problematic for graphs with cycles: good finite-length LDPC codes have an exponential number of cycles in their Tanner graphs (Etzion, Trachtenberg & Vardy 99)
Even with optimized encoder constructions, a BP-to-ML performance gap remains due to convergence to pseudo-codewords (Wiberg 95; Forney et al. 01; Koetter & Vontobel 03)

Examples
Short codes - e.g. Tanner code with N = 155, k = 64, diameter = 6, girth = 8, d_min = 20
Long codes - e.g. Margulis code with N = 2640, k = 1320

Goals
Construct a decoder with:
Improved BP decoding performance
More flexibility in the performance-versus-complexity trade-off
Nearly ML performance at much lower computational cost
Reduced or eliminated LDPC error floors
Applications
Can be used with any "off-the-shelf" LDPC encoder
Can be applied to any communication / data storage channel

Subgraph Definitions
Transmitted binary vector x ∈ {0,1}^N; received vector r ∈ R^N; channel → BCJR detector → BP decoder → decoded vector x̂(L) ∈ {0,1}^N after L iterations
Syndrome s = H x̂(L)
C_S(L) - set of unsatisfied check nodes (SUC): C_S(L) = { c_i : ( H x̂(L) )_i ≠ 0 }
V_S(L) - set of variable nodes incident to c ∈ C_S(L)
E_S(L) - set of edges connecting V_S(L) and C_S(L)
Definition 1: the SUC graph G_S(L) = ( V_S(L), E_S(L), C_S(L) ) is the graph induced by the SUC C_S(L)
d_Gs(v) - degree of v ∈ V in the SUC graph G_S(L); d_Gs(v) ≤ d_G(v)
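The SUC definitions above can be sketched in a few lines; this is a minimal illustration assuming a dense 0/1 parity-check matrix in numpy (the helper name `suc_subgraph` is ours, not from the slides):

```python
import numpy as np

def suc_subgraph(H, x_hat):
    """Induce the subgraph of unsatisfied checks (SUC) from a BP hard decision.

    H     : (N_c x N) parity-check matrix over GF(2), as a 0/1 numpy array
    x_hat : length-N hard-decision vector x^(L) after L BP iterations

    Returns (suc_checks, suc_vars, d_Gs), where d_Gs[v] is the degree of
    variable node v in the SUC graph (number of unsatisfied checks touching v).
    """
    s = (H @ x_hat) % 2                    # syndrome s = H x^(L)
    suc_checks = np.flatnonzero(s)         # C_S: unsatisfied check nodes
    d_Gs = H[suc_checks].sum(axis=0)       # degree of each variable in G_S
    suc_vars = np.flatnonzero(d_Gs)        # V_S: variables incident to C_S
    return suc_checks, suc_vars, d_Gs
```

Since d_Gs counts only unsatisfied checks, it can never exceed the full Tanner-graph degree d_G(v), matching the last line of the slide.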

Properties of the SUC graph
Observation 1: the higher the degree d_Gs(v) of a node v ∈ V_S(L), the more likely v is to be in error
e.g. statistics for Tanner (155,64) code blocks for which BP failed on the AWGN channel at SNR = 2.5 dB
→ Select a v node and perform information correction

Node Selection Strategy 1
Strategy 1: determine the SUC graph and select the node with maximal degree d_Gs in the SUC graph G_S(L)
Select node v_0 or v_2 or v_12

Properties of the SUC graph, cntd
Definition 2: nodes v_1 and v_2 are neighbors with respect to the SUC if there exists c ∈ C_S(L) incident to both v_1 and v_2
n_v(m) - number of neighbors of v with degree d_Gs = m; e.g. n_v(2) = 1 and n_v(1) = 4
Observation 2: the smaller the number of neighbors (wrt the SUC graph) with high degree, the more likely v is to be in error

Node Selection Strategy 2
Strategy 2: among nodes with maximal degree d_Gs, select a node with the minimal number of highest-degree neighbors
n_v0(2) = n_v12(2) = 1; n_v2(2) = 2; n_v0(1) = 4; n_v12(1) = 6
Select node v_0
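Strategy 2 can be sketched as follows; this is our own illustrative rendering (dense numpy matrix, hypothetical function name), with ties on the count of highest-degree neighbors broken on progressively lower degrees, as in the v_0 versus v_12 example above:

```python
import numpy as np

def select_node_strategy2(H, x_hat):
    """Strategy 2 sketch: among variable nodes of maximal SUC degree, choose
    the one with the fewest highest-degree SUC neighbors (two variables are
    neighbors if they share an unsatisfied check)."""
    s = (H @ x_hat) % 2
    suc = np.flatnonzero(s)                       # unsatisfied checks C_S
    d_Gs = H[suc].sum(axis=0)                     # SUC degrees of variables
    d_max = int(d_Gs.max())
    best, best_key = None, None
    for v in np.flatnonzero(d_Gs == d_max):       # candidates with max degree
        checks_v = suc[H[suc, v] == 1]            # unsatisfied checks on v
        nbrs = np.flatnonzero(H[checks_v].sum(axis=0))
        nbrs = nbrs[nbrs != v]                    # SUC neighbors of v
        # (n_v(d_max), n_v(d_max-1), ...): lexicographically smaller is better
        key = tuple(int(np.sum(d_Gs[nbrs] == m)) for m in range(d_max, 0, -1))
        if best_key is None or key < best_key:
            best, best_key = v, key
    return best
```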

Alternatives to Strategy 2
max d_Gs = max_{v ∈ V} d_Gs(v)
Set of suspicious nodes S_v = { v : d_Gs(v) = max d_Gs }
Edge penalty function ( N_c - set of v nodes incident to c ):
r(v,c) = max_{v_n ∈ N_c\{v}} d_Gs(v_n), if N_c\{v} ≠ ∅; r(v,c) = 0, if N_c\{v} = ∅
Penalty function R(v) = Σ_{c ∈ C_s} r(v,c) − max_{c ∈ C_s} r(v,c)
Select v_p ∈ S_v as v_p = argmin_{v ∈ S_v} R(v)
Numerous related approaches possible

Node Selection Strategy 3
Decoder input on node v_i: O(v_i); for the memoryless AWGN channel, O(v_i) = 2 r_i / σ²
Observation 3: a variable node v is more likely to be incorrect if its decoder input is less reliable, i.e., if | O(v) | is lower
Strategy 3: among nodes with maximal degree d_Gs, select the node with minimal input reliability | O(v) |
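Strategy 3 reduces to a two-key selection: maximal SUC degree, then minimal |O(v)|. A minimal sketch (the function name is ours):

```python
import numpy as np

def select_node_strategy3(d_Gs, O):
    """Strategy 3 sketch: among variable nodes of maximal degree in the SUC
    graph, pick the node whose decoder input reliability |O(v)| is smallest.

    d_Gs : per-variable degree in the SUC graph
    O    : decoder-input LLRs (e.g. O(v) = 2 r_v / sigma^2 on memoryless AWGN)
    """
    d_Gs, O = np.asarray(d_Gs), np.asarray(O)
    candidates = np.flatnonzero(d_Gs == d_Gs.max())   # suspicious nodes S_v
    return candidates[np.argmin(np.abs(O[candidates]))]
```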

Message Passing - Notation
Set of log-likelihood ratio messages on v nodes: M = (C, O)
Decoder input: O = [ O(v_0), …, O(v_N−1) ]
Channel detector (BCJR) input: B = [ B(v_0), …, B(v_N−1) ]
[Figure: message flow between the channel detector trellis T, variable nodes V, and check nodes C]

Symbol Correction Procedures
Replace the decoder and detector input LLRs corresponding to the selected v_p:
O(v_p) = +S and B(v_p) = +S, or O(v_p) = −S and B(v_p) = −S
Perform correction in stages: test 2^j combinations at stage j; for every test perform an additional K_j iterations; maximum number of attempts (stages) j_max
[Figure: binary test tree over stages j = 1, 2, 3]
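The staged procedure can be sketched as a loop; this is an illustrative "first codeword" variant in which `bp_decode` and `select_node` are placeholder callables for the BP decoder and a node-selection strategy, and a single K is used for all stages:

```python
import itertools

def staged_correction(bp_decode, select_node, O, B, S=100.0, j_max=3, K=10):
    """Sketch of the staged symbol-correction loop ("first codeword" variant).

    bp_decode(O, B, iters) -> (x_hat, valid) and select_node(x_hat) -> v_p
    are placeholders.  At stage j, all 2^j sign combinations over the j
    selected nodes are tested, each with K extra BP iterations; S is the
    saturated LLR magnitude that forces a symbol.
    """
    forced = []                                   # selected nodes v_p, in order
    for j in range(1, j_max + 1):
        x_hat, valid = bp_decode(O, B, iters=K)   # unmodified BP attempt
        if valid:
            return x_hat
        forced.append(select_node(x_hat))         # pick next suspicious node
        for signs in itertools.product((+1.0, -1.0), repeat=j):
            O_t, B_t = list(O), list(B)
            for v, sgn in zip(forced, signs):     # force O(v_p) = B(v_p) = +/-S
                O_t[v] = B_t[v] = sgn * S
            x_hat, valid = bp_decode(O_t, B_t, iters=K)
            if valid:
                return x_hat                      # "first codeword": stop here
    return None                                   # all attempts failed
```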

Symbol Correction Procedures
"Codeword listing" approach: test each of the 2^j_max possibilities; W - collection of valid codeword candidates; pick the most likely candidate, e.g. for the AWGN channel set x̂ = argmin_{w ∈ W} d( r, w )
"First codeword" approach: stop at the first valid codeword; faster convergence, slightly worse performance for large j_max
[Figure: binary test tree over stages j = 1, 2, 3]
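The final selection step of the "codeword listing" approach is a nearest-candidate search; a minimal sketch, assuming BPSK mapping 0 → +1, 1 → −1 on AWGN (the function name is ours):

```python
import numpy as np

def pick_codeword(r, W):
    """"Codeword listing" selection sketch: among the collected valid
    codeword candidates W, return the one minimizing the Euclidean
    distance d(r, w) to the received vector r (AWGN channel)."""
    r = np.asarray(r, float)
    signals = 1.0 - 2.0 * np.asarray(W, float)   # BPSK-modulate candidates
    d2 = ((signals - r) ** 2).sum(axis=1)        # squared distances to r
    return list(np.asarray(W)[np.argmin(d2)])    # most likely candidate
```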

Parallel and Serial Implementation ( j_max = 3 )
[Figure: parallel and serial binary test trees over stages j = 1, 2, 3]

Complexity - Parallel and Serial Implementation
Parallel: decoding is continued; M must be stored (storage 2^j_max); lower K_j required; "first codeword" strategy gives the fastest convergence
Serial: decoding is restarted; M need not be stored; higher K_j required
[Figure: binary test tree over stages j = 1, 2, 3]

Can We Achieve ML?
Fact 1: as j_max → N, the "codeword listing" algorithm with K_j = 0 for j < j_max and K_jmax = 1 becomes an ML decoder
For low values of j_max (j_max << N) it performs close to the ML decoder
Tanner ( N = 155, k = 64 ) code, j_max = 11, K_j = 10; decoding continued (faster decoding, M must be stored); ML practically achieved
[Figure: WER versus Eb/N0 (0 to 4 dB) comparing the ML decoder, the "codeword listing" strategy, and original BP (max 100 iterations)]

Pseudo-codeword Elimination
Pseudo-codewords compete with codewords in locally-operating BP decoding (Koetter & Vontobel 2003)
c̃ - a codeword in an m-cover of G
ω_i - fraction of time v_i ∈ V assumes an incorrect value in c̃
ω = ( ω_0, ω_1, …, ω_{N−1} ) - pseudo-codeword
Pseudo-distance (for AWGN): ( Σ_i ω_i )² / Σ_i ω_i²
Eliminate a large number of pseudo-codewords by forcing symbol "0" or symbol "1" on nodes v_p
Pseudo-distance spectra improved; the minimum pseudo-distance can be increased if j_max is sufficiently large

Types of BP Decoding Errors
1. Very high SNRs (error floor region)
Stable errors on saturated subgraphs: the decoder reaches a steady state and fails; the messages passed in the SUC graph are saturated
Definition 3: decoder D has reached a steady state in the interval [ L_1, L_2 ] if C_s(L) = C_s(L_1) for all L ∈ [ L_1, L_2 ]
2. Medium SNRs (waterfall region)
Unstable errors: the decoder does not reach a steady state
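Definition 3 translates directly into a check over a per-iteration log of the SUC; a minimal sketch (`suc_history` is a hypothetical log we introduce for illustration):

```python
def reached_steady_state(suc_history, L1, L2):
    """Definition 3 as code: the decoder has reached a steady state on
    [L1, L2] if the set of unsatisfied checks C_s(L) equals C_s(L1) for
    every iteration L in [L1, L2].

    suc_history[L] holds the SUC after iteration L, e.g. as a frozenset.
    """
    return all(suc_history[L] == suc_history[L1] for L in range(L1, L2 + 1))
```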

SUC Properties in the Error Floor Region
Theorem 1: in the error floor region … [bound not recovered]
Corollary: for general LDPC codes with … [condition not recovered]
Information correction at high SNRs (error floor region)
Pros: small-size SUC; faster convergence
Cons: d_Gs plays no role in node selection

Simulation Results
Tanner (155,64) code; regular (3,5) code; channel: AWGN
Strategy 3, j_max = 11, K_j = 10
More than 1 dB gain; ML practically achieved
[Figure: WER versus Eb/N0 (0 to 4 dB) comparing the ML decoder, the "codeword listing" strategy, the "first codeword" strategy, and original BP (max 100 iterations)]

Simulation Results
Tanner (155,64) code; regular (3,5) code; channel: AWGN
Strategy 3, "first codeword" strategy
j_max = 4, 6, 8 and 11; K_j = 10

Simulation Results – Error Floors
Margulis (2640,1320) code; regular (3,6) code; channel: AWGN
Strategy 3, "first codeword" strategy
j_max = 5, K_j = 20
More than 2 orders of magnitude WER improvement

Simulation Results – ISI Channels
Tanner (155,64) code
Channels: dicode (1−D), EPR4 (1−D)(1+D)²
Strategy 2, j_max = 11, K_j = 20
1 dB gain; 20% of detected errors are ML errors

Conclusion
Information correction in BP decoding of LDPC codes
More flexibility in the performance-versus-complexity trade-off
Can nearly achieve ML performance at much lower computational cost
Eliminates a large number of pseudo-codewords
Reduces or eliminates LDPC error floors
Applications
Can be used with any "off-the-shelf" LDPC encoder
Can be applied to any communication / data storage channel
