Pattern Recognition

Book description

The book offers a thorough introduction to Pattern Recognition aimed at master's and advanced bachelor's students of engineering and the natural sciences. Besides classification - the heart of Pattern Recognition - special emphasis is put on features: their typology, their properties, and their systematic construction. Additionally, general principles that govern Pattern Recognition are illustrated and explained in a comprehensible way. Rather than presenting a complete overview of the rapidly evolving field, the book aims to clarify the concepts so that the reader can easily understand the underlying ideas and the rationale behind the methods. To this end, the mathematical treatment of Pattern Recognition is carried far enough that the mechanisms of action become clear and visible, but no farther. Consequently, not all derivations are driven to the last mathematical detail, as a mathematician might expect; ideas of proofs are presented instead of complete proofs. From the authors' point of view, this approach makes it possible to teach the essential ideas of Pattern Recognition with sufficient depth within a relatively lean book.

  • Mathematical methods explained thoroughly
  • Extremely practical approach with many examples
  • Based on more than ten years of lectures at the Karlsruhe Institute of Technology
  • For students as well as practitioners

Table of contents

  1. Cover
  2. Title Page
  3. Copyright
  4. Preface
  5. Contents
  6. List of Tables
  7. List of Figures
  8. Notation
  9. Introduction
  10. 1 Fundamentals and definitions
    1. 1.1 Goals of pattern recognition
    2. 1.2 Structure of a pattern recognition system
    3. 1.3 Abstract view of pattern recognition
    4. 1.4 Design of a pattern recognition system
    5. 1.5 Exercises
  11. 2 Features
    1. 2.1 Types of features and their traits
      1. 2.1.1 Nominal scale
      2. 2.1.2 Ordinal scale
      3. 2.1.3 Interval scale
      4. 2.1.4 Ratio scale and absolute scale
    2. 2.2 Feature space inspection
      1. 2.2.1 Projections
      2. 2.2.2 Intersections and slices
    3. 2.3 Transformations of the feature space
    4. 2.4 Measurement of distances in the feature space
      1. 2.4.1 Basic definitions
      2. 2.4.2 Elementary norms and metrics
      3. 2.4.3 A metric for sets
      4. 2.4.4 Metrics on the ordinal scale
      5. 2.4.5 The Kullback–Leibler divergence
      6. 2.4.6 Tangential distance measure
    5. 2.5 Normalization
      1. 2.5.1 Alignment, elimination of physical dimension, and leveling of proportions
      2. 2.5.2 Lighting adjustment of images
      3. 2.5.3 Distortion adjustment of images
      4. 2.5.4 Dynamic time warping
    6. 2.6 Selection and construction of features
      1. 2.6.1 Descriptive features
      2. 2.6.2 Model-driven features
      3. 2.6.3 Construction of invariant features
    7. 2.7 Dimensionality reduction of the feature space
      1. 2.7.1 Principal component analysis
      2. 2.7.2 Kernelized principal component analysis
      3. 2.7.3 Independent component analysis
      4. 2.7.4 Multiple discriminant analysis
      5. 2.7.5 Dimensionality reduction by feature selection
      6. 2.7.6 Bag of words
    8. 2.8 Exercises
  12. 3 Bayesian decision theory
    1. 3.1 General considerations
    2. 3.2 The maximum a posteriori classifier
    3. 3.3 Bayesian classification
      1. 3.3.1 The Bayesian optimal classifier
      2. 3.3.2 Reference example: Optimal decision regions
      3. 3.3.3 The minimax classifier
      4. 3.3.4 Normally distributed features
      5. 3.3.5 Arbitrarily distributed features
    4. 3.4 Exercises
  13. 4 Parameter estimation
    1. 4.1 Maximum likelihood estimation
    2. 4.2 Bayesian estimation of the class-specific distributions
    3. 4.3 Bayesian parameter estimation
      1. 4.3.1 Least squared estimation error
      2. 4.3.2 Constant penalty for failures
    4. 4.4 Additional remarks on Bayesian classification
    5. 4.5 Exercises
  14. 5 Parameter free methods
    1. 5.1 The Parzen window method
    2. 5.2 The k-nearest neighbor method
    3. 5.3 k-nearest neighbor classification
    4. 5.4 Exercises
  15. 6 General considerations
    1. 6.1 Dimensionality of the feature space
    2. 6.2 Overfitting
    3. 6.3 Exercises
  16. 7 Special classifiers
    1. 7.1 Linear discriminants
      1. 7.1.1 More than two classes
      2. 7.1.2 Nonlinear separation
    2. 7.2 The perceptron
    3. 7.3 Linear regression
    4. 7.4 Artificial neural networks
    5. 7.5 Autoencoders
    6. 7.6 Deep learning
      1. 7.6.1 Historical difficulties and successful approaches
      2. 7.6.2 Unsupervised pre-training
      3. 7.6.3 Stochastic gradient descent
      4. 7.6.4 Rectified linear units
      5. 7.6.5 Convolutional neural networks
    7. 7.7 Support vector machines
      1. 7.7.1 Linear separation with maximum margin
      2. 7.7.2 Dual formulation
      3. 7.7.3 Nonlinear mapping
      4. 7.7.4 The kernel trick
      5. 7.7.5 No linear separability
      6. 7.7.6 Discussion
    8. 7.8 Matched filters
    9. 7.9 Classification of sequences
      1. 7.9.1 Markov models
      2. 7.9.2 Hidden states
    10. 7.10 Exercises
  17. 8 Classification with nominal features
    1. 8.1 Decision trees
      1. 8.1.1 Decision tree learning
      2. 8.1.2 Influence of the features used
    2. 8.2 Random forests
    3. 8.3 String matching
    4. 8.4 Grammars
    5. 8.5 Exercises
  18. 9 Classifier-independent concepts
    1. 9.1 Learning theory
      1. 9.1.1 The central problem of statistical learning
      2. 9.1.2 Vapnik–Chervonenkis learning theory
    2. 9.2 Empirical evaluation of classifier performance
      1. 9.2.1 Receiver operating characteristic
      2. 9.2.2 Multi-class setting
      3. 9.2.3 Theoretical bounds with finite test sets
      4. 9.2.4 Dealing with small datasets
    3. 9.3 Boosting
    4. 9.4 Rejection
    5. 9.5 Exercises
  19. A Solutions to the exercises
    1. A.1 Chapter 1
    2. A.2 Chapter 2
    3. A.3 Chapter 3
    4. A.4 Chapter 4
    5. A.5 Chapter 5
    6. A.6 Chapter 6
    7. A.7 Chapter 7
    8. A.8 Chapter 8
    9. A.9 Chapter 9
  20. B A primer on Lie theory
  21. C Random processes
  22. Bibliography
  23. Glossary
  24. Index

Product information

  • Title: Pattern Recognition
  • Author(s): Jürgen Beyerer, Matthias Richter, Matthias Nagel
  • Release date: December 2017
  • Publisher(s): De Gruyter Oldenbourg
  • ISBN: 9783110537963