
Block code

In coding theory, block codes are one of the two common types of channel codes (the other being convolutional codes), which enable reliable transmission of digital data over unreliable communication channels subject to channel noise.

A block code transforms a message $m$ consisting of a sequence of information symbols over an alphabet $\Sigma$ into a fixed-length sequence $c$ of $n$ encoding symbols, called a code word. In a linear block code, each input message has a fixed length of $k < n$ input symbols. The redundancy added to a message by transforming it into a larger code word enables a receiver to detect and correct errors in a transmitted code word, and, using a suitable decoding algorithm, to recover the original message. The redundancy is described in terms of its information rate or, more simply for a linear block code, in terms of its code rate, $k/n$.

The error correction performance of a block code is described by the minimum Hamming distance $d$ over all pairs of distinct code words; this quantity is called the distance of the code.

Definitions

The encoder for a block code divides the information sequence into message blocks, each message block containing $k$ information symbols over an alphabet set $\Sigma$; i.e., a message can be represented as a $k$-tuple $m = (m_1, m_2, \ldots, m_k) \in \Sigma^k$. The total number of possible different messages is therefore $|\Sigma|^k$. The encoder transforms each message independently into an $n$-tuple codeword $c = (c_1, c_2, \ldots, c_n) \in \Sigma^n$. The code $C$ of block length $n$ over $\Sigma$ is a subset of $\Sigma^n$; the total number of possible different codewords is the same as the total number of messages, and $k = \log_{|\Sigma|} |C|$ is called the dimension. The rate of the block code is defined as $R = k/n$.

The Hamming weight $w(c)$ of a codeword $c$ is defined as the number of non-zero positions in $c$. The Hamming distance $\Delta(c_1, c_2)$ between two codewords $c_1$ and $c_2$ is defined as the number of positions in which the codewords differ. The (minimum) distance $d$ of a block code is defined as the minimum distance between any two different codewords: $d = \min_{c_1 \ne c_2 \in C} \Delta(c_1, c_2)$. The notation $(n, k, d)_\Sigma$ is used to represent a block code of dimension $k$ and block length $n$ over the alphabet set $\Sigma$, with distance $d$. In particular, if the alphabet set has size $|\Sigma| = q$, the block code is denoted as an $(n, k, d)_q$ code.
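As an illustration of these definitions, the following Python sketch computes the Hamming weight, the Hamming distance, and the minimum distance of a small code by brute force (the function names are ours, not from any standard library):

from itertools import combinations

def hamming_weight(c):
    # Number of non-zero positions in the codeword c.
    return sum(1 for symbol in c if symbol != 0)

def hamming_distance(c1, c2):
    # Number of positions in which c1 and c2 differ.
    assert len(c1) == len(c2)
    return sum(1 for a, b in zip(c1, c2) if a != b)

def minimum_distance(code):
    # Minimum Hamming distance over all pairs of distinct codewords.
    # Brute force over |C|^2 pairs, so only suitable for small codes.
    return min(hamming_distance(c1, c2) for c1, c2 in combinations(code, 2))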

Properties

A codeword can be considered as a point in the $n$-dimensional space $\Sigma^n$, and the code $C$ is a subset of $\Sigma^n$. That a code $C$ has distance $d$ means that for every codeword $c \in C$ there is no other codeword in the Hamming ball centered at $c$ with radius $d - 1$, where the Hamming ball $B(c, r) = \{y \in \Sigma^n : \Delta(c, y) \le r\}$ is defined as the collection of $n$-dimensional words whose Hamming distance to $c$ is no more than $r$. Consequently, a code $C$ with (minimum) distance $d$ has the following properties:

$C$ can detect up to $d - 1$ transmission errors, since changing at most $d - 1$ positions of a codeword can never produce another codeword.
$C$ can correct up to $\lfloor (d-1)/2 \rfloor$ errors, since the Hamming balls of radius $\lfloor (d-1)/2 \rfloor$ around distinct codewords are disjoint.

Maximum likelihood decoding (MLD) can be used to decode: given a received word $y \in \Sigma^n$, it outputs the codeword closest to $y$ in Hamming distance, $\mathrm{MLD}(y) = \arg\min_{c \in C} \Delta(c, y)$. If the number of errors is at most $\lfloor (d-1)/2 \rfloor$, MLD outputs the transmitted codeword.
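A brute-force MLD decoder can be sketched in Python as follows, reusing hamming_distance from the sketch above (practical decoders exploit the code's algebraic structure instead of scanning every codeword):

def mld_decode(code, received):
    # Return the codeword closest to `received` in Hamming distance.
    # Ties are broken arbitrarily by min(); if the number of errors is
    # at most floor((d-1)/2), the result is the transmitted codeword.
    return min(code, key=lambda c: hamming_distance(c, received))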

Example

Consider the simple repetition code $C = \{000, 111\} \subseteq \{0, 1\}^3$. The distance of $C$ is 3 because any two different valid codewords differ in at least 3 symbols. Therefore, if there is at most 1 error in the received word, e.g. $001$, the error can be corrected and the codeword $000$ recovered. However, if 2 or more errors are possible, the decoder cannot tell which codeword was sent: for example, $011$ could be the received word for either $111$ (with one error) or $000$ (with two errors).
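Using the sketches above, the repetition-code example can be checked directly:

# The repetition code C = {000, 111}, represented as tuples.
code = [(0, 0, 0), (1, 1, 1)]

print(minimum_distance(code))       # 3
print(mld_decode(code, (0, 0, 1)))  # (0, 0, 0): a single error is corrected
print(mld_decode(code, (0, 1, 1)))  # (1, 1, 1): two errors mislead the decoder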

Lower and upper bounds of block codes

Family of codes

$C = \{C_i\}_{i \ge 1}$ is called a family of codes, where $C_i$ is an $(n_i, k_i, d_i)_q$ code with monotonically increasing block length $n_i$.

The rate of a family of codes is defined as $R(C) = \lim_{i \to \infty} \frac{k_i}{n_i}$.

The relative distance of a family of codes is defined as $\delta(C) = \lim_{i \to \infty} \frac{d_i}{n_i}$.

To explore the relationship between $R(C)$ and $\delta(C)$, a set of lower and upper bounds of block codes are known.

Hamming bound

$R \le 1 - H_q\left(\frac{\delta}{2}\right) + o(1)$
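In its exact, non-asymptotic form, the Hamming (sphere-packing) bound states that $|C| \cdot \mathrm{Vol}_q(n, \lfloor (d-1)/2 \rfloor) \le q^n$, where $\mathrm{Vol}_q(n, r) = \sum_{i=0}^{r} \binom{n}{i}(q-1)^i$ is the volume of a Hamming ball of radius $r$. A minimal Python sketch of this computation (the function names are illustrative):

from math import comb

def hamming_ball_volume(n, r, q):
    # Number of words within Hamming distance r of a fixed word in Sigma^n.
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

def hamming_bound_codewords(n, d, q):
    # Largest |C| permitted by the sphere-packing argument.
    t = (d - 1) // 2
    return q ** n // hamming_ball_volume(n, t, q)

# For n = 7, d = 3, q = 2 the bound gives 2^7 / 8 = 16 codewords, which
# the [7,4,3] Hamming code attains: it is a perfect code.
print(hamming_bound_codewords(7, 3, 2))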

Singleton bound

$R + \delta \le 1 + \frac{1}{n}$, i.e. $k + d \le n + 1$.

Plotkin bound

For $q = 2$: $R + 2\delta \le 1$.

For the general case, the following Plotkin bounds hold for any $C \subseteq \mathbb{F}_q^n$ with distance $d$:

1. If $d = \left(1 - \frac{1}{q}\right) n$, then $|C| \le 2qn$.

2. If $d > \left(1 - \frac{1}{q}\right) n$, then $|C| \le \frac{qd}{qd - (q - 1)n}$.

For any $q$-ary code with distance $\delta$, $R \le 1 - \left(\frac{q}{q - 1}\right)\delta + o(1)$.

Gilbert–Varshamov bound

$R \ge 1 - H_q(\delta) - o(1)$, where $0 \le \delta \le 1 - \frac{1}{q}$, and $H_q(x) = x \log_q(q - 1) - x \log_q x - (1 - x)\log_q(1 - x)$ is the $q$-ary entropy function.
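The $q$-ary entropy function and the rate guaranteed by the Gilbert–Varshamov bound can be evaluated numerically; a minimal sketch:

from math import log

def entropy_q(x, q):
    # q-ary entropy H_q(x) for 0 <= x < 1, with H_q(0) = 0.
    if x == 0:
        return 0.0
    return x * log(q - 1, q) - x * log(x, q) - (1 - x) * log(1 - x, q)

def gv_rate(delta, q):
    # Rate that the Gilbert-Varshamov bound guarantees to be achievable.
    return 1 - entropy_q(delta, q)

# Binary codes of relative distance 0.1 exist with rate at least about 0.53.
print(gv_rate(0.1, 2))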

Johnson bound

Define $J_q(\delta) = \left(1 - \frac{1}{q}\right)\left(1 - \sqrt{1 - \frac{q\delta}{q - 1}}\right)$.
Let $J_q(n, d, e)$ be the maximum number of codewords in a Hamming ball of radius $e$ for any code $C \subseteq \mathbb{F}_q^n$ of distance $d$.

Then we have the Johnson bound: $J_q(n, d, e) \le qdn$ if $\frac{e}{n} \le \frac{q - 1}{q}\left(1 - \sqrt{1 - \frac{q}{q - 1} \cdot \frac{d}{n}}\right) = J_q\left(\frac{d}{n}\right)$.

Elias–Bassalygo bound

$R = \frac{\log_q |C|}{n} \le 1 - H_q\left(J_q(\delta)\right) + o(1)$
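Numerically, the Elias–Bassalygo upper bound can be compared with the Gilbert–Varshamov lower bound; a sketch building on entropy_q and gv_rate above (the helper names are ours):

from math import sqrt

def johnson_radius(delta, q):
    # J_q(delta) = (1 - 1/q) * (1 - sqrt(1 - q*delta/(q - 1))).
    return (1 - 1 / q) * (1 - sqrt(1 - q * delta / (q - 1)))

def elias_bassalygo_rate(delta, q):
    # Asymptotic Elias-Bassalygo upper bound: R <= 1 - H_q(J_q(delta)).
    return 1 - entropy_q(johnson_radius(delta, q), q)

# At delta = 0.1, binary codes have rate between roughly 0.53 (GV, lower
# bound) and 0.70 (Elias-Bassalygo, upper bound).
print(gv_rate(0.1, 2), elias_bassalygo_rate(0.1, 2))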

Sphere packings and lattices

Block codes are tied to the sphere packing problem, which has received some attention over the years. In two dimensions, it is easy to visualize: take a bunch of pennies flat on the table and push them together. The result is a hexagon pattern like a bee's nest. But block codes rely on more dimensions, which cannot easily be visualized. The powerful Golay code used in deep space communications uses 24 dimensions. If used as a binary code (as it usually is), the dimensions refer to the length of the codeword as defined above.

The theory of coding uses the N-dimensional sphere model. For example, how many pennies can be packed into a circle on a tabletop, or, in three dimensions, how many marbles can be packed into a globe? Other considerations enter the choice of a code. For example, hexagon packing into the constraint of a rectangular box will leave empty space at the corners. As the dimensions get larger, the percentage of empty space grows smaller. But at certain dimensions, the packing uses all the space, and these codes are the so-called perfect codes. There are very few such codes.

Another property is the number of neighbors a single codeword may have.[1] Again, consider pennies as an example. First we pack the pennies in a rectangular grid. Each penny will have 4 near neighbors (and 4 at the corners, which are farther away). In a hexagonal packing, each penny will have 6 near neighbors. In three and four dimensions, the maximum packings are given by the 12-face and 24-cell, with 12 and 24 neighbors, respectively. When we increase the dimensions, the number of near neighbors increases very rapidly. In general, the value is given by the kissing numbers.

The result is that the number of ways for noise to make the receiver choose a neighbor (hence an error) grows as well. This is a fundamental limitation of block codes, and indeed of all codes. It may be harder to cause an error to a single neighbor, but the number of neighbors can be large enough that the total error probability actually suffers.[1]

Notes

1. ^ a b Christian Schlegel and Lance Pérez (2004). Trellis and turbo coding. Wiley-IEEE. p. 73. ISBN 9780471227557.

References

J.H. van Lint (1992). Introduction to Coding Theory (2nd ed.). GTM 86. Springer-Verlag. p. 31. ISBN 3-540-54894-7.
F.J. MacWilliams; N.J.A. Sloane (1977). The Theory of Error-Correcting Codes. North-Holland. p. 35. ISBN 0-444-85193-3.
