This book comprehensively treats neural network models from a statistical mechanics perspective. It starts with one of the most influential developments in the theory of neural networks: Hopfield's analysis of networks with symmetric connections, which applied the spin-system approach and the notion of an energy function from physics. Introduction to the Theory of Neural Computation uses these powerful tools to analyze neural networks as associative memory stores and as solvers of optimization problems. A detailed analysis of multi-layer networks and recurrent networks follows. The book ends with chapters on unsupervised learning and a formal treatment of the relationship between statistical mechanics and neural networks. Little information is provided about applications and implementations, and the treatment of the material reflects the authors' background as physicists. However, the book is essential for a solid understanding of the computational potential of neural networks. Introduction to the Theory of Neural Computation assumes that the reader is familiar with undergraduate-level mathematics but requires no background in physics; all of the necessary tools are introduced in the book.