Alternate blurbs
Jacket blurb (215 words)
Information theory and inference, often taught separately, are
here united in one entertaining textbook. These topics lie at the
heart of many exciting areas of contemporary science and engineering
-- communication, signal processing, data mining, machine learning,
pattern recognition, computational neuroscience, bioinformatics, and
cryptography.
This textbook introduces theory in tandem with applications.
Information theory is taught alongside practical communication
systems, such as arithmetic coding for data compression and
sparse-graph codes for error correction. A toolbox of inference
techniques, including message-passing algorithms, Monte Carlo
methods, and variational approximations, is developed alongside
applications of these tools to clustering, convolutional codes,
independent component analysis, and neural networks.
The final part of the book describes the state of the art in
error-correcting codes, including low-density parity-check codes,
turbo codes, and digital fountain codes -- the twenty-first-century
standards for satellite communications, disk drives, and data
broadcast.
Richly illustrated, filled with worked examples and over 400
exercises, some with detailed solutions, the book is ideal for
self-learning and for undergraduate or graduate courses. Interludes
on crosswords, evolution, and sex provide entertainment along the
way.
The result is a textbook on information, communication, and
coding for a new generation of students, and an unparalleled entry
point into these subjects for professionals in areas as diverse as
computational biology, financial engineering, and machine learning.
Old jacket blurb
Information theory, probabilistic reasoning, coding theory and
algorithmics lie at the heart of some of the most exciting areas of
contemporary science and engineering. They are integral to fields such
as communication, signal processing, data mining, machine learning,
pattern recognition, computational neuroscience, bioinformatics, and
cryptography. David MacKay breaks new ground in this
exciting and entertaining textbook by introducing mathematical
technology in tandem with applications, providing at once both
motivation and hands-on guidance for problem-solving and
modelling. For example, he covers not only the theoretical foundations
of information theory, but practical methods for communication
systems, including arithmetic coding for data compression,
and low-density parity-check codes for reliable communication over noisy
channels. The duality between communication and machine learning is
illustrated through data modelling and compression; machine learning
is also represented by clustering, classification, feedforward
networks, Hopfield networks, Boltzmann machines and independent
component analysis. A toolbox of probabilistic techniques
is covered in detail: message-passing, Monte Carlo, and
variational methods. The final part of the book, on
sparse graph codes, describes the state of the art in error-correcting
codes, including chapters on low-density parity-check codes,
turbo codes, and digital fountain codes. There are over 390 exercises,
some with full solutions, which, together with worked examples, extend
the text and enhance technique and understanding. A profusion of
illustrations enlivens and complements the text. Interludes on
crosswords, evolution, and sex provide entertaining glimpses of unconventional
applications. In sum, this is a textbook for courses in information,
communication and coding for a new generation of students, and an
unparalleled entry point to these subjects for professionals working
in areas as diverse as computational biology, data mining, financial
engineering and machine learning.
150-word blurb
Information theory, probability, coding and algorithmics lie at the
heart of some of the most dynamic areas of contemporary science and
engineering. David MacKay breaks new ground in this exciting and
entertaining textbook by introducing mathematical technology in tandem
with applications, providing simultaneously both motivation and
hands-on guidance for problem-solving and modelling. For example, he
covers the theoretical foundations of information theory, and
practical methods for communication systems. Communication and machine
learning are linked through data modelling and compression. Over 390
exercises, some with full solutions, and nearly 40 worked examples
extend the text and enhance technique and understanding. Enlivening
and enlightening illustrations abound. In sum, this is a textbook for
courses in information, communication and coding for a new generation
of students, and an unparalleled entry point to these subjects for
professionals working in areas as diverse as computational biology,
data mining, financial engineering and machine learning.
90-word blurb
Information theory, probabilistic reasoning, coding theory and
algorithmics underpin contemporary science and engineering. David
MacKay breaks new ground in this exciting and entertaining textbook by
introducing mathematics in tandem with applications. Over 390
exercises, some with full solutions, extend the text and enhance
technique and understanding. Enlivening and enlightening illustrations
abound. It's ideal for courses in information, communication and
coding for a new generation of students, and an unparalleled entry
point to these subjects for professionals working in areas as diverse
as computational biology, data mining, financial engineering and
machine learning.
50-word blurb
This exciting and entertaining textbook is
ideal for courses in information, communication and coding for a new
generation of students, and an unparalleled entry point to these
subjects for professionals working in areas as diverse as
computational biology, data mining, financial engineering and machine
learning.
Another old short blurb
This textbook offers comprehensive
coverage of Shannon's theory of information as well as the theory of
neural networks and probabilistic data modelling. It includes
explanations of Shannon's source coding theorem and noisy-channel
coding theorem as well as descriptions of practical data compression
systems. Many examples and exercises make the book ideal for students
to use as a class textbook, or as a resource for researchers who need
to work with neural networks or state-of-the-art error-correcting
codes.
Yet another old blurb
This textbook offers comprehensive
coverage of Shannon's theory of information as well as the theory of
neural networks and probabilistic data modelling. Shannon's source
coding theorem and noisy-channel coding theorem are explained and
proved. Accompanying these theoretical results are descriptions of
practical data compression systems including the Huffman coding
algorithm and the less well-known arithmetic coding algorithm. The
treatment of neural networks is approached from two perspectives. On
the one hand, the information-theoretic capabilities of some neural
network algorithms are examined, and on the other hand, neural
networks are motivated as statistical models. With many examples and
exercises, this book is ideal for students to use as the text for a
course, or as a resource for researchers who need to work with neural
networks or state-of-the-art error-correcting codes.