
I have a feeling that the answer is yes, based upon my own preview of it, the way it is cited in more recent works, and of course the many mentions here.

I have read other classic texts but am still looking for interesting proofs, insights, etc. from the statistical perspective on the learning problem, function approximation, and so on.

By the way, I’m talking about Brian Ripley’s Pattern Recognition and Neural Networks.

Edit: thanks to the folks who have engaged. I think this is really a more general question about going back to old sources for insight. It's something I love, so I'm glad that others see the value in it too (e.g., see Peter's answer).

  • What criteria are you using to judge the relevance of a textbook? (Commented Dec 3 at 1:38)
  • Hmm, I need to think about that. I’d say I want to know whether it offers some insight on statistical foundations, or has interesting proofs or examples that still hold up. (Commented Dec 3 at 1:53)
  • I haven't read the book, just checked the contents, but I would say that any book on pattern recognition now would need to cover convolutional networks, residual connections, and using multiple heads on a pretrained network backbone. Without those, the book would lose a lot of relevance. (Commented Dec 3 at 1:55)
  • @RickHass, not my field of study, but one thing I have learnt is that old books often contain treatments that are otherwise missing from newer literature, and these can be more insightful. I have encountered such things frequently. So even though they might be a bit "outdated", that doesn't nullify their future usefulness - at least, that's my take as a regular user of less commonly used books. (Commented 2 days ago)
  • @User1865345 agreed! I also enjoy seeing the evolution of how things are treated - what is similar, what has changed, etc. (Commented 2 days ago)

4 Answers


I don't know this book, but given that you ask about "insight", I'm going to say "yes".

First, Ripley is a very smart guy. It's probably worth reading what he has to say.

Second, insight is insight. There are much older books than this that are full of it. Exploratory Data Analysis by John Tukey is still full of insight and it was written in 1977; that's just the first one that comes to mind, and there are others.

In a comment you say you want proofs. Well, proofs are proofs forever (see Pythagoras or Euclid).

If you are asking whether Ripley's is the most relevant book, or the one to use as a main source, well, I wouldn't know, but I'd guess it isn't.

  • +1 Thanks Peter! Your answer matches my intuition. (Commented Dec 3 at 13:18)

Yes, it is a good book (I've been re-reading it recently), but I found the mathematical notation rather difficult. I'd say Chris Bishop's Neural Networks for Pattern Recognition probably covers most of the same material in a more understandable way (and the netlab toolbox for MATLAB is nice, especially combined with Ian Nabney's book explaining how it all works).

Reading a lot of material online, you would think that neural networks were invented in about 2014, but a lot of knowledge of the basics is missing from modern tutorials and the like. I'd also recommend the "Tricks of the Trade" book as worth reading (the quality of the chapters is variable, but some are very good).

  • Yeah, I enjoy the Bishop book, but I actually feel it's heavily weighted towards engineering. I guess I'll have to do the comparison myself! (Commented 2 days ago)

I read it too long ago to remember exactly what was in it (I had it borrowed from the university library for some time - those were the old days), but I remember that I learnt some things from it better than from other sources. It's an insightful book. Sure, modern material isn't treated and some of the content is outdated, but knowing the basics, much of which are quite old, is still worth something. There is progress in data science, but occasionally the wheel is reinvented, or at least put to new uses, so it pays to properly understand how the wheel works, so to speak.

  • What a fun time it was when university libraries still had books! Seems like it's more and more Springer e-book access these days ;) (Commented 2 days ago)

Any book on neural nets written before (say) AlexNet in 2012 is going to be missing something important - neural networks didn't really work usefully back when BDR was writing this book (I learned about trees and nets and the like from him in 1992). The same is true of The Elements of Statistical Learning, written by the Stanford folks in 2008.

There's a lot of useful material in both books, but it's a pity there isn't a similar book written by statisticians after the advent of deep convolutional nets, attention, transformers, and so on.

  • Agreed. Kevin Murphy's book is quite good, but it's not the same. Perhaps everyone has moved on? (Commented yesterday)
  • Sorry, but I don't agree with the first part - neural networks worked just fine prior to 2011 (I trained my first one in 1990). What changed is the range of problems for which neural networks were useful, and the major difference was the improvement in hardware that enabled larger amounts of training data, rather than algorithmic developments (at least until GANs, attention, etc.). (Commented yesterday)
  • I certainly agree with the second paragraph though (work on benign overfitting, for example)! (Commented yesterday)
