The problem of how to transmit data reliably is of enormous significance in the modern world. Communications channels are invariably noisy, and error-correcting codes incorporating redundancy have been developed for fast, reliable data transmission. The limit on how good such codes can be is imposed by Shannon's channel capacity theorem. Modern codes, such as turbo codes and LDPC codes, are now ubiquitous in mobile phone and digital satellite television transmissions, and provide almost perfect reliability at rates close to capacity (the Shannon limit). Compared to earlier codes, they can double data throughput for a given transmitting power or, alternatively, halve the requisite transmitting power for a given data rate. However, although such codes currently perform close to capacity, their performance can only be verified empirically, and there is no guarantee that they will preserve these properties in the future as computational resources and codeword sizes increase.
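For the additive white Gaussian noise channel discussed here, the Shannon limit mentioned above is given by the standard capacity formula C = (1/2) log2(1 + P/N) bits per channel use, where P/N is the signal-to-noise power ratio. A minimal calculation (an illustrative sketch, not code from the cited work) shows how capacity grows with transmitting power:

```python
import math

def awgn_capacity(snr):
    """Shannon capacity of the AWGN channel, in bits per channel use.

    C = (1/2) * log2(1 + snr), where snr = P/N is the ratio of signal
    power to noise power. No code can communicate reliably above this rate.
    """
    return 0.5 * math.log2(1.0 + snr)

# At snr = 1 (signal power equals noise power), capacity is 0.5 bit/use;
# quadrupling the power (snr = 3 -> 1 + snr = 4) raises it to 1 bit/use.
print(awgn_capacity(1.0), awgn_capacity(3.0))
```

The logarithmic dependence on power is what makes the trade-off mentioned above possible: operating closer to capacity buys substantially more rate from the same power budget than a code that falls short of the limit.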
Yale University researchers have now invented a collection of practical codes that achieve rates near capacity and that are supported by an elegant theory demonstrating that the codes remain nearly optimal at every computational scale. In a breakthrough in the field of communications, this is the first coding method demonstrated (i) to have a communication rate close to channel capacity at all code sizes and (ii) to have an error probability that is exponentially small as a function of the size of the code. The theory applies to the real-world case of additive white Gaussian noise. The predictable scaling of these sparse superposition codes makes them excellent candidates for use in future communications protocols.
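In a sparse superposition code, the codeword is a superposition of a few columns of a dictionary (design) matrix with independent Gaussian entries: the coefficient vector is partitioned into sections, and the message selects one column from each section. The toy encoder below is an illustrative sketch of this idea only (dimensions, names, and the plain-Python matrix representation are our own choices, not the authors' implementation):

```python
import random

def make_dictionary(n, L, M, seed=0):
    # n x (L*M) design matrix with i.i.d. standard Gaussian entries.
    # The L*M columns are grouped into L sections of M columns each.
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(L * M)] for _ in range(n)]

def encode(X, message, L, M):
    """Sparse superposition encoding (illustrative).

    message: list of L indices, each in range(M), selecting one column
    per section. The codeword is the sum of the L selected columns,
    i.e. X @ beta for a coefficient vector beta with exactly L nonzeros.
    Each message therefore carries L * log2(M) bits.
    """
    assert len(message) == L and all(0 <= idx < M for idx in message)
    cols = [sec * M + idx for sec, idx in enumerate(message)]
    n = len(X)
    return [sum(X[i][c] for c in cols) for i in range(n)]

# Example: n = 8 channel uses, L = 3 sections of M = 4 columns.
X = make_dictionary(8, 3, 4)
codeword = encode(X, [0, 2, 1], 3, 4)
```

The sparsity of the coefficient vector is what makes fast decoding feasible; in the transmitted signal, Gaussian channel noise is then added to each codeword coordinate.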
A. R. Barron and A. Joseph (2010). Towards fast reliable communication at rates near capacity with Gaussian noise. Proc. IEEE International Symposium on Information Theory, Austin, Texas, June 13-18, 2010.
A. R. Barron and A. Joseph (2011). Sparse superposition codes: Fast and reliable at rates approaching capacity with Gaussian noise. February 2011.