Information Theory, Inference, and Learning Algorithms is available free online.
Book Description
This book is divided into six parts: Data Compression, Noisy-Channel Coding, Further Topics in Information Theory, Probabilities and Inference, Neural Networks, and Sparse Graph Codes.
Table of Contents
- Introduction to Information Theory
- Probability, Entropy, and Inference
- More about Inference
- The Source Coding Theorem
- Symbol Codes
- Stream Codes
- Codes for Integers
- Dependent Random Variables
- Communication over a Noisy Channel
- The Noisy-Channel Coding Theorem
- Error-Correcting Codes and Real Channels
- Hash Codes: Codes for Efficient Information Retrieval
- Binary Codes
- Very Good Linear Codes Exist
- Further Exercises on Information Theory
- Message Passing
- Communication over Constrained Noiseless Channels
- Crosswords and Codebreaking
- Why have Sex? Information Acquisition and Evolution
- An Example Inference Task: Clustering
- Exact Inference by Complete Enumeration
- Maximum Likelihood and Clustering
- Useful Probability Distributions
- Exact Marginalization
- Exact Marginalization in Trellises
- Exact Marginalization in Graphs
- Laplace’s Method
- Model Comparison and Occam’s Razor
- Monte Carlo Methods
- Efficient Monte Carlo Methods
- Ising Models
- Exact Monte Carlo Sampling
- Variational Methods
- Independent Component Analysis and Latent Variable Modelling
- Random Inference Topics
- Decision Theory
- Bayesian Inference and Sampling Theory
- Introduction to Neural Networks
- The Single Neuron as a Classifier
- Capacity of a Single Neuron
- Learning as Inference
- Hopfield Networks
- Boltzmann Machines
- Supervised Learning in Multilayer Networks
- Gaussian Processes
- Deconvolution
- Low-Density Parity-Check Codes
- Convolutional Codes and Turbo Codes
- Repeat-Accumulate Codes
- Digital Fountain Codes