eBook Pattern Recognition and Neural Networks ePub

by Brian D. Ripley

  • ISBN: 0521717701
  • Category: Computer Science
  • Subcategory: Computers
  • Author: Brian D. Ripley
  • Language: English
  • Publisher: Cambridge University Press; 1st edition (January 28, 2008)
  • Pages: 416
  • ePub book: 1113 kb
  • Fb2 book: 1223 kb
  • Other formats: mbr lrf txt
  • Rating: 4.9
  • Votes: 824

Description



With unparalleled coverage and a wealth of case studies, this book gives valuable insight into both the theory and the enormously diverse applications (which can be found in remote sensing, astrophysics, engineering and medicine, for example).




Ripley brings together two crucial ideas in pattern recognition: statistical methods and machine learning via neural networks. He brings unifying principles to the fore, and reviews the state of the subject. Ripley also includes many examples to illustrate real problems in pattern recognition and how to overcome them.

Comments

Nalmezar
B.D. Ripley is one of the greats in the R community, and also in the applied stat community. Wonderfully clear book.

5 stars for the seller too. Early delivery. Book in great shape, as promised.
GoodLike
I studied neural networks 30 years ago, and learned at the time that, although demonstrably useful, they are not "neural" in any sense. Reviewing this well-written tome shows me that no progress has been made on making neural nets any more neural than they were before, by dint of the fact that we still have no idea how neurons work in the first place. Might as well call them "gravity nets", after another phenomenon we know little about.
Pipet
Let me start by saying that this book assumes a lot of background, especially in statistics. It dives into the math right away without even a hint of a gentle slope. But what I appreciate is that the math is never used for its own sake; it is always justified. The book starts with an introduction to the problem neural nets are to be applied to: the pattern recognition task. It proceeds to the elements of statistical decision theory, then goes up to linear discriminant analysis and perceptrons, then up you go to feed-forward neural nets. Non-parametric models and tree-based classifiers are covered next. Belief networks and unsupervised methods (MDS, clustering, etc.) follow. Coverage is extensive, although I would like to see more in the area of unsupervised learning, which is quite foundational to the whole business.
What sells me on this book, quite frankly, is that it always keeps an eye on a real-world example. No model or algorithm is introduced without a real-world problem it was intended to solve. You would be better served by the Bishop book (Neural Networks for Pattern Recognition, by C. Bishop, ISBN 0198538642) if you are looking for a quick introduction. I would say Ripley's book is the perfect second book on the subject.
I must applaud the editors and designers of the book. The book itself, apart from the material it covers, is an aesthetically most pleasant creation for the somewhat dry subject. Its use of margins is a piece of art: margins are wide and accessible, important points are highlighted there, and you can get to the needed point by flipping the pages quickly. The quality of the paper is very good, and the book opens well and holds its form very well. If you take it seriously and use it often, these qualities will gain in importance.
Captain America
I concur with the other reviewers. This book requires the reader to be proficient in many different disciplines. It is extremely difficult to digest if you lack an in-depth background in statistics (Bayes' theorem, etc.), calculus and advanced algebra. Many sections of this book were used as part of Ripley's graduate courses at Oxford, where he is still a professor of applied statistics. Where I part company with many of the reviewers is that I will not penalize this book for going over my head at times. It is intended for graduate students in statistics or computer science.
The neural network section explains the workings of NNs that are typically hidden from users of NN software packages. In some cases a click of a button is all that is needed to do what is explained in considerable depth in this tome. It can be very useful to fully understand what has happened when a program switch is altered; this prevents using a NN naively and receiving a result that makes unfounded predictions.
Breder
If you want a nice, up-to-date, elementary treatment of neural networks and statistical pattern recognition with lots of nice pictures, I recommend the new edition of Duda and Hart. However, neural networks were basically started by the computer-science / artificial-intelligence community, using analogies to the human nervous system and perceived connections to human thought processes. These connections and arguments are weak.
However, a statistical theory of nonlinear classification algorithms shows that these methods have nice properties and mathematical justification. Statistical pattern recognition research is well over 30 years old and very well established. So these statistical connections are important for putting neural networks on firm ground and gaining greater acceptance in the statistical as well as the engineering community.

Ripley provides a theoretical treatment of the state of the art in statistical pattern recognition. His treatment is thorough, covering all the important developments. He provides a large bibliography and a nice glossary of terms in the back of the book.

Recent papers on neural networks and data mining are often quick to generate results but not very good at providing validation techniques that show the perceived performance is not just an artifact of overfitting a model. This is an area where statisticians play a very important role, as they are keenly aware, through their experience with regression modeling and prediction, of the crucial need for cross-validation. Ripley covers this quite clearly in Section 2.6, titled "How complex a model do we need?"
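The cross-validation idea the reviewer points to can be sketched in a few lines. This is not code from the book; it is a minimal illustration of k-fold cross-validation using a toy nearest-centroid classifier on 1-D data (all function and variable names here are invented for the example):

```python
import random

def k_fold_cv(X, y, fit, predict, k=5, seed=0):
    """Estimate the out-of-sample error rate by k-fold cross-validation:
    hold each fold out in turn, train on the rest, count mistakes."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = 0
    for fold in folds:
        held_out = set(fold)
        train = [i for i in idx if i not in held_out]
        model = fit([X[i] for i in train], [y[i] for i in train])
        errors += sum(predict(model, X[i]) != y[i] for i in fold)
    return errors / len(X)

# Toy nearest-centroid classifier on 1-D data (illustrative only).
def fit(xs, ys):
    return {c: sum(x for x, t in zip(xs, ys) if t == c) /
               sum(1 for t in ys if t == c)
            for c in set(ys)}

def predict(centroids, x):
    return min(centroids, key=lambda c: abs(x - centroids[c]))

X = [0.1, 0.2, 0.3, 0.15, 0.9, 1.0, 1.1, 0.95]
y = [0, 0, 0, 0, 1, 1, 1, 1]
print(k_fold_cv(X, y, fit, predict, k=4))  # → 0.0 (classes are well separated)
```

The point, as the reviewer notes, is that the error estimate comes from data the model never saw during fitting, so it cannot be flattered by overfitting in the way an apparent (training-set) error rate can.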

It is nice to see the thoroughness of this work. For example, in error rate estimation, many know of the advances of Lachenbruch and Mickey on error rate estimation in discriminant analysis and the further advances of Efron and others with the bootstrap. But in between there was also significant progress by Glick on smooth estimators. This work has been overlooked by many statisticians probably because some of it appears in the engineering literature (but one important paper was in the Journal of the American Statistical Association [JASA] in 1972). To some extent this oversight may be due to the fact that it was not mentioned in Efron's famous 1983 JASA paper and hence is usually missed in the bootstrap literature. Bootstrap methods and cross-validation play a prominent role in this text.
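Again, not from the book: a minimal sketch of the optimism-corrected bootstrap error estimate in the spirit of Efron's approach — apparent error plus the average over resamples of how much a bootstrap-trained model's error grows when moved from its own resample to the full data. A toy nearest-centroid classifier stands in for a real model; all names are invented for the example:

```python
import random

# Toy nearest-centroid classifier on 1-D data (illustrative only).
def fit(xs, ys):
    return {c: sum(x for x, t in zip(xs, ys) if t == c) /
               sum(1 for t in ys if t == c)
            for c in set(ys)}

def predict(centroids, x):
    return min(centroids, key=lambda c: abs(x - centroids[c]))

def bootstrap_error(X, y, B=200, seed=0):
    """Optimism-corrected bootstrap estimate of the true error rate."""
    rng = random.Random(seed)
    n = len(X)
    full_model = fit(X, y)
    apparent = sum(predict(full_model, X[i]) != y[i] for i in range(n)) / n
    optimism = 0.0
    for _ in range(B):
        # Resample the data with replacement and refit.
        idx = [rng.randrange(n) for _ in range(n)]
        m = fit([X[i] for i in idx], [y[i] for i in idx])
        # Optimism = error on the original data minus error on the resample.
        err_boot = sum(predict(m, X[i]) != y[i] for i in idx) / n
        err_full = sum(predict(m, X[i]) != y[i] for i in range(n)) / n
        optimism += err_full - err_boot
    return apparent + optimism / B

X = [0.1, 0.2, 0.3, 0.15, 0.9, 1.0, 1.1, 0.95]
y = [0, 0, 0, 0, 1, 1, 1, 1]
print(bootstrap_error(X, y))  # small, near zero for this separable toy data
```

Smoothed estimators in Glick's sense, and refinements such as the .632 estimator, adjust this basic scheme; the resample-refit-correct pattern above is the common core.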

This is an excellent reference book for anyone seriously interested in pattern recognition research. For applied and theoretical statisticians who want a good account of the theory behind neural networks it is a must.