A classical measure of goodness of a code is the code's
minimum Hamming distance. In fact,
a large part of traditional coding theory is concerned with finding
the fundamental tradeoffs between three parameters: the length of the code, the number of codewords in the code, and the minimum distance of the code. It is well known
that the minimum Hamming distance d of a
code reflects its guaranteed error-correcting capability in the sense
that any error pattern of weight at most
⌊(d−1)/2⌋ can be corrected.
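For a small code, both the minimum distance and the resulting correction radius ⌊(d−1)/2⌋ can be computed by brute force. The sketch below uses the [7,4] binary Hamming code as an illustrative example (the generator matrix is one standard choice, not taken from this text):

```python
from itertools import product

# Generator matrix of the [7,4] binary Hamming code -- a standard
# textbook example, used here only for illustration.
G = [
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg, G):
    """Encode a message as a GF(2) linear combination of the rows of G."""
    cw = [0] * len(G[0])
    for bit, row in zip(msg, G):
        if bit:
            cw = [c ^ r for c, r in zip(cw, row)]
    return cw

# For a linear code, the minimum distance equals the minimum weight of a
# nonzero codeword, so brute force over all nonzero messages suffices.
d = min(sum(encode(m, G)) for m in product([0, 1], repeat=len(G)) if any(m))
t = (d - 1) // 2
print(d, t)  # the [7,4] Hamming code has d = 3, hence t = 1
```

Brute force is only feasible for toy codes; computing the minimum distance of a general linear code is NP-hard, which is part of why guarantees beyond ⌊(d−1)/2⌋ are so valuable.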
However, most codes can, with high probability, correct error patterns of substantially higher weight. This insight is the cornerstone of modern coding theory, which attempts to capitalize on the full correction capability of a code. One of the most successful realizations of this phenomenon is found in binary low-density parity-check (LDPC) codes. These codes come equipped with an iterative message-passing algorithm to be performed at the receiver's end, which is extremely efficient and corrects, with high probability, many more error patterns than guaranteed by the minimum distance.
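A full sum-product (belief-propagation) decoder is beyond the scope of this passage, but the local, iterative flavor of message passing can be conveyed by a much simpler bit-flipping decoder: each iteration flips the bit involved in the largest number of unsatisfied checks. The parity-check matrix below is for the same toy [7,4] Hamming code and is an illustrative assumption; with such a small, dense matrix the decoder corrects only some single-bit errors, which already hints that performance depends on the structure of the underlying graph.

```python
# An illustrative parity-check matrix (toy [7,4] Hamming code,
# not an actual LDPC matrix).
H = [
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

def bit_flip_decode(H, y, max_iter=20):
    """Toy bit-flipping decoder: a drastic simplification of iterative
    message passing, shown only to convey its local, iterative nature."""
    y = list(y)
    for _ in range(max_iter):
        # A check (row of H) is unsatisfied if its parity is odd.
        syndrome = [sum(h & b for h, b in zip(row, y)) % 2 for row in H]
        if not any(syndrome):
            return y  # all checks satisfied: y is a codeword
        # For each bit, count the unsatisfied checks it participates in.
        counts = [sum(s for s, row in zip(syndrome, H) if row[i])
                  for i in range(len(y))]
        # Flip the single most "suspicious" bit (lowest index on ties).
        y[counts.index(max(counts))] ^= 1
    return y

c = [1, 0, 0, 0, 0, 1, 1]          # a codeword of the toy code
y = list(c); y[3] ^= 1             # introduce a single bit error
print(bit_flip_decode(H, y) == c)  # True: the error is corrected
```

Each decision above uses only a bit's immediate neighborhood in the Tanner graph, never the matrix as a whole; that locality is exactly what makes the algorithm cheap.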
In this situation, we are left with the problem of finding new mathematically precise concepts that can take over the role of the minimum Hamming distance for such high-performance codes. This concept can be identified as the fundamental cone of a code. As a binary linear code, an LDPC code C is defined by a parity-check matrix H. The strength of the iterative decoding algorithm, i.e., its low complexity, comes from the fact that the algorithm operates locally on a so-called Tanner graph representing the matrix H. However, this same fact also leads to a fundamental weakness of the algorithm: because it acts locally, the algorithm cannot distinguish whether it is acting on the graph itself or on some finite unramified cover of the graph. This leads to the notion of pseudocodewords, which arise from codewords in the codes corresponding to the covers and which compromise the decoder. Thus, to understand the performance of LDPC codes, we must understand the graph covers and the codes corresponding to them. This is tantamount to understanding a cone in R^n defined by inequalities arising from H, called the fundamental cone.
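The inequalities defining the fundamental cone can be made concrete. As the fundamental cone of H is usually defined in the literature, a vector w ∈ R^n lies in it if w_i ≥ 0 for all i and, for every check (row of H) and every position i in that check's support, w_i does not exceed the sum of the remaining entries of w over the same support. A minimal membership test under that assumed definition:

```python
def in_fundamental_cone(H, w, tol=1e-9):
    """Check the fundamental-cone inequalities for w in R^n:
    w_i >= 0 for all i, and for each row of H and each position i
    in its support, w_i <= sum of w over the other support positions."""
    if any(x < -tol for x in w):
        return False  # nonnegativity violated
    for row in H:
        supp = [i for i, h in enumerate(row) if h]
        total = sum(w[i] for i in supp)
        for i in supp:
            if w[i] > total - w[i] + tol:
                return False  # one support entry dominates the rest
    return True

# Illustrative parity-check matrix (toy [7,4] Hamming code).
H = [
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

# A codeword, viewed as a real vector, satisfies all the inequalities...
print(in_fundamental_cone(H, [1, 0, 0, 0, 0, 1, 1]))  # True
# ...while a unit vector violates every check that contains its support.
print(in_fundamental_cone(H, [1, 0, 0, 0, 0, 0, 0]))  # False
```

Since the cone is defined by finitely many linear inequalities in w, questions about pseudocodewords become questions of polyhedral geometry, which is what makes this reformulation tractable.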
