The Back Cover
Information theory and inference, often taught separately, are here united in one
entertaining textbook. These topics lie at the heart of many exciting areas of
contemporary science and engineering: communication, signal processing, data mining,
machine learning, pattern recognition, computational neuroscience, bioinformatics, and
cryptography.
This textbook introduces theory in tandem with applications. Information theory is
taught alongside practical communication systems, such as arithmetic coding for data
compression and sparse-graph codes for error correction. A toolbox of inference
techniques, including message-passing algorithms, Monte Carlo methods, and variational
approximations, is developed alongside applications of these tools to clustering,
convolutional codes, independent component analysis, and neural networks.
The final part of the book describes the state of the art in error-correcting codes,
including low-density parity-check codes, turbo codes, and digital fountain codes:
the twenty-first-century standards for satellite communications, disk drives, and data
broadcast.
Richly illustrated, filled with worked examples and over 400 exercises, some with
detailed solutions, David MacKay's groundbreaking book is ideal for self-study and
for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex
provide entertainment along the way.
In sum, this is a textbook on information, communication, and coding for a new
generation of students, and an unparalleled entry point into these subjects for
professionals in areas as diverse as computational biology, financial engineering, and
machine learning.
Endorsements by famous people
This is an extraordinary and important book, generous with insight
and rich with detail in statistics, information theory, and
probabilistic modeling across a wide swathe of standard, creatively
original, and delightfully quirky topics. David MacKay is an
uncompromisingly lucid thinker, from whom students, faculty and
practitioners all can learn.
Zoubin Ghahramani and Peter Dayan
Gatsby Computational Neuroscience Unit, University College London.
An utterly original book that shows the connections between such disparate
fields as information theory and coding, inference, and statistical physics.
Dave Forney, M.I.T.
An instant classic, covering
everything from Shannon's fundamental theorems to the postmodern theory
of LDPC codes. You'll want two copies of this astonishing book,
one for the office and one for the fireside at home.
Bob McEliece, California Institute of Technology