Pattern Recognition and Machine Learning by Christopher M. Bishop


Title : Pattern Recognition and Machine Learning
Author : Christopher M. Bishop
ISBN-10 : 0387310738
ISBN-13 : 9780387310732
Language : English
Format Type : Hardcover
Number of Pages : 738
Publication : First published January 1, 2006

Pattern recognition has its origins in engineering, whereas machine learning grew out of computer science. However, these activities can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had a significant impact on both algorithms and applications. This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts. Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential as the book includes a self-contained introduction to basic probability theory.


Pattern Recognition and Machine Learning Reviews


  • Manny

    Dave, who knows about these things, recommended it... I have just ordered a copy.

  • Wooi Hen Yap

    For beginners who need to understand the Bayesian perspective on machine learning, I'd say this is the best so far. The author has made a good attempt to explain complicated theories in a simplified manner by giving examples and applications. The best parts of the book are the chapters on graphical models (chapter 8), mixture models and EM (chapter 9), and approximate inference (chapter 10). The reason I didn't give 5 stars is that the perspective on machine learning is too narrow (Bayesian only), which I feel doesn't sit well with the book's title. Statistical learning and non-Bayesian perspectives on machine learning are not covered much here. To make up for this gap, Tom Mitchell's Machine Learning does a better job. Nevertheless, it is still a great book to keep on the shelf for machine learning.

  • Gavin

    Timeless, towering. My yardstick: The first time I read it (looked at it) I was way out of my depth and understood little. Year by year I misunderstand less of it.

  • Oldrich

    1. The book is mainly about the Bayesian approach, and many important techniques are missing. This is the biggest problem, I think.
    2. “Inconsistent difficulty”: too much time is spent on simple things and very little on complicated material.
    3. Lack of demonstration of the techniques on real-world problems.

  • Asim Bakhshi

    An amazing textbook that would never get old.

  • DJ

    recommended reading on machine learning from Gatsby (the neuroscience group in London, not the fictional Roaring 20s tail-chaser)

  • David

    Being a new text, it covers topics in modern machine learning research. Bishop prefers intuitive explanations with lots of figures over mathematical rigor (which is fine by me! =). A sample chapter is available at Bishop's website.

  • Van Huy

    Took me a year to finish this book :D

  • Fernando Flores

    One of my first books on machine learning. This book can be painful if you don't have a solid background in algebra.

  • Emil Petersen

    I started reading this book about 2 years too late, in my last year of my computer science degree. I have only now finished it, and I had to skim some of the last chapters. It's a pretty monumental task to read it through, and I cannot help but wonder how much it must have taken to write. Bishop has extraordinary insight into the Bayesian treatment of pattern recognition, and it is expressed here in sometimes excruciating detail. If you're a beginner, I would just read the first 4 or so chapters, maybe chapter 8, and skim some of the variational inference sections. For more advanced learners, the later chapters provide some excellent detail on how to go beyond the basics.

    I'm a little sad that this book was not a part of my official coursework, as I have only later discovered how relevant much of the content was for a significant part of my courses and, even worse, my thesis (where variational autoencoders, hidden Markov models and Bayesian ensemble models were at the center, all of which are either described directly in this book or given a foundation here). The variational autoencoder, which rose to prominence after the book was written, would fit right in.

    Chapter 5 on neural networks is good, but it feels disconnected from the rest of the book. Still, it's a good chapter in itself, and even though a lot has happened since it was written, the foundations described here remain the same: perceptrons, backpropagation and activation functions. People might use ReLU as the activation now, and there are a few new tricks, but the basics haven't changed.

    Bishop is not the most pedagogical author, especially if you read more than the first few chapters, so if you need someone to hold your hand while reading, this is probably not the best place to start. In any case, the book seems great as a reference and if you like this kind of stuff, you should definitely read it at some point.

  • Trung Nguyen

    I consider PRML one of the classic machine learning textbooks despite its moderate age (only 10 years). The book presents the probabilistic approach to modelling, in particular Bayesian machine learning. The material can seem quite intimidating for readers who come from a not-so-strong mathematical background, but once you get over the initial inertia and practice deriving the equations on your own, you'll gain a deep understanding of the content.

  • Nick

    A very decent mathematical overview of Data Science/ML with an emphasis on variational methods. It is a particularly good intro to Bayesian stats/philosophy, with nice pictures, which is good for those who don't know stats that well but are scientists at heart.

    I enjoyed it, but I have also recommended it many times over to friends who knew far less stats than me, and they often found it extremely compelling (good for teaching).

    It is an intro book, just to note.

  • Mahdi shafiee

    I haven't read the whole book, but I believe it is one of the best references for machine learning.
    One of its weak points is that deep learning is not presented.

  • Oleg Dats

    Read it if you want to really understand statistical learning. A fundamental book about fundamental things.

    It is not an easy one, but it will pay off.

  • Andreea

    I read this textbook in parallel to my Machine Learning lectures, and it was what I needed to get a deeper understanding of the underlying mathematical and probabilistic concepts, the methodologies and the techniques currently used in Pattern Recognition.

    The book covers the foundations of Machine Learning, as well as more advanced Pattern Recognition techniques. The writing is comprehensible and not extremely dry, the book structure makes sense and the topics follow a natural order. I can't say I understood everything, and I wasn't expecting to - learning isn't linear and that's okay.

    Each chapter of the book builds up your understanding of the techniques, starting with the basics and progressing towards more advanced topics, in a way that is (relatively) easy to follow.

    Such topics include Bayesian, linear, and nonlinear classifiers, clustering, feature selection and generation, and classifier evaluation, as well as more specific methods, and the book doesn't fail to outline their applications.

    The math, while pretty advanced, is very well explained, and with enough focus, you can follow it step by step while it leads you to the formulas that underlie the algorithms and methods used in practice.

  • Avishek Nag

    I have observed that people often make the mistake of reading this book first when starting out in machine learning. That's where the confusion comes from. This is not at all a beginners' book. You really need some years of experience in ML to fully comprehend the breadth and depth of the book. I agree that the language used may sound a little complex, but you should not give up there.

    And one thing: this book is not for people looking for hands-on experience. No. This one is for somebody who is really interested in the core meaty math and who already has some experience in ML. And you definitely need pen and paper while reading it. This is not for people looking for a crash course. This book will definitely help you build a solid in-and-out understanding of the core math behind ML.

  • Skillovilla

    A nice book about machine learning and pattern recognition concepts; machine learning is the future of technology, and this is a good read to learn about it.

  • John

    First off, it needs to be noted that there are things about this book that are old and should be ignored. Deep learning, and anything involving it, has gone way beyond this. The neural network discussion is very dated.

    Some of the approaches it discusses are also largely out of favor, as they've been supplanted by other technologies. But things sometimes come around again.

    Beyond that, though, there's a lot of good fundamentals that haven't changed so much.

    As other reviewers note, it is a heavily Bayesian approach, which is something I like.

    I read it a long time ago; it was good then, and it still reads well.

  • Jerzy Baranowski

    This is kind of a cheat, as a science book snuck into my summer reading pile. However, I'm reading it to broaden my horizons, not for an exam of sorts, so I think it counts. This is not a book from which a person can learn how to do machine learning from scratch. However, especially if you are not afraid of mathematics, you can understand how it all works. Bishop frames most machine learning models in a probabilistic, Bayesian framework. For me that is attractive, though the computational methods are a bit dated, as a lot has happened since 2006. Strong recommendation from me.

  • Dhanya Jothimani

    Actual Rating: 4.5

    Recommended for understanding the Bayesian perspective on machine learning algorithms, but it doesn't give a comparative analysis with the frequentist approach. Good for learning the theoretical (or mathematical) aspects of algorithms and their graphical representation. A focus on real-world applications is missing.
    P.S.: Used for teaching a Bayesian Statistics and Machine Learning course for graduate students.

  • Rick Sam

    Christopher Bishop comes from a theoretical physics background.

    Take a look at his PhD thesis, "Semi-classical technique in field theory: some applications."

    Bishop is an extensive writer.
    I went through his work once!


    Some notes from the work

    Deus Vult
    Gottfried

  • Frederik

    Bishop always makes me feel like an idiot, even when reading about concepts I'm already very familiar with. Very, very dense, with lots of mathematical muscle-flexing that requires you to stare at a page for 30 minutes until you get that "a-ha" moment. Maybe it's because I don't have a stats background, but I just don't enjoy this book.