Efficient Learning Machines: Theories, Concepts, and Applications for Engineers and System Designers

Book description

Machine learning techniques provide cost-effective alternatives to traditional methods for extracting underlying relationships from data and for predicting future events by training models on existing information. Efficient Learning Machines explores the major topics of machine learning, including knowledge discovery, classification, genetic algorithms, neural networking, kernel methods, and biologically inspired techniques.

Mariette Awad and Rahul Khanna’s synthetic approach weaves together the theoretical exposition, design principles, and practical applications of efficient machine learning. Their experiential emphasis, expressed in their close analysis of sample algorithms throughout the book, aims to equip engineers, students of engineering, and system designers to design and create new and more efficient machine learning systems. Readers of Efficient Learning Machines will learn how to recognize and analyze the problems that machine learning technology can solve for them, how to implement and deploy standard solutions to sample problems, and how to design new systems and solutions.

Advances in computing performance, storage, memory, unstructured information retrieval, and cloud computing have coevolved with a new generation of machine learning paradigms and big data analytics, which the authors present in the conceptual context of their traditional precursors. Awad and Khanna explore current developments in the deep learning techniques of deep neural networks, hierarchical temporal memory, and cortical algorithms.

Nature suggests sophisticated learning techniques that deploy simple rules to generate highly intelligent and organized behaviors with adaptive, evolutionary, and distributed properties. The authors examine the most popular biologically inspired algorithms, together with a sample application to distributed datacenter management. They also discuss machine learning techniques for multiobjective optimization, in which real-world solutions are constrained and evaluated by how well they satisfy multiple objectives in aggregate. Two chapters on support vector machines and their extensions focus on recent improvements to the classification and regression techniques at the core of machine learning.

Table of contents

  1. Cover
  2. Title
  3. Copyright
  4. About ApressOpen
  5. Dedication
  6. Contents at a Glance
  7. Contents
  8. About the Authors
  9. About the Technical Reviewers
  10. Acknowledgments
  11. Chapter 1: Machine Learning
    1. Key Terminology
    2. Developing a Learning Machine
    3. Machine Learning Algorithms
      1. Popular Machine Learning Algorithms
      2. C4.5
      3. k-Means
      4. Support Vector Machines
      5. Apriori
      6. Expectation Maximization
      7. PageRank
      8. AdaBoost (Adaptive Boosting)
      9. k-Nearest Neighbors
      10. Naive Bayes
      11. Classification and Regression Trees
    4. Challenging Problems in Data Mining Research
      1. Scaling Up for High-Dimensional Data and High-Speed Data Streams
      2. Mining Sequence Data and Time Series Data
      3. Mining Complex Knowledge from Complex Data
      4. Distributed Data Mining and Mining Multi-Agent Data
      5. Data Mining Process-Related Problems
      6. Security, Privacy, and Data Integrity
      7. Dealing with Nonstatic, Unbalanced, and Cost-Sensitive Data
    5. Summary
    6. References
  12. Chapter 2: Machine Learning and Knowledge Discovery
    1. Knowledge Discovery
      1. Classification
      2. Clustering
      3. Dimensionality Reduction
      4. Collaborative Filtering
    2. Machine Learning: Classification Algorithms
      1. Logistic Regression
      2. Random Forest
      3. Hidden Markov Model
      4. Multilayer Perceptron
    3. Machine Learning: Clustering Algorithms
      1. k-Means Clustering
      2. Fuzzy k-Means (Fuzzy c-Means)
      3. Streaming k-Means
    4. Machine Learning: Dimensionality Reduction
      1. Singular Value Decomposition
      2. Principal Component Analysis
      3. Lanczos Algorithm
    5. Machine Learning: Collaborative Filtering
      1. User-Based Collaborative Filtering
      2. Item-Based Collaborative Filtering
      3. Alternating Least Squares with Weighted-λ-Regularization
    6. Machine Learning: Similarity Matrix
      1. Pearson Correlation Coefficient
      2. Spearman Rank Correlation Coefficient
      3. Euclidean Distance
      4. Jaccard Similarity Coefficient
    7. Summary
    8. References
  13. Chapter 3: Support Vector Machines for Classification
    1. SVM from a Geometric Perspective
    2. SVM Main Properties
    3. Hard-Margin SVM
    4. Soft-Margin SVM
    5. Kernel SVM
    6. Multiclass SVM
    7. SVM with Imbalanced Datasets
    8. Improving SVM Computational Requirements
    9. Case Study of SVM for Handwriting Recognition
      1. Preprocessing
      2. Feature Extraction
      3. Hierarchical, Three-Stage SVM
      4. Experimental Results
      5. Complexity Analysis
    10. References
  14. Chapter 4: Support Vector Regression
    1. SVR Overview
    2. SVR: Concepts, Mathematical Model, and Graphical Representation
    3. Kernel SVR and Different Loss Functions: Mathematical Model and Graphical Representation
    4. Bayesian Linear Regression
    5. Asymmetrical SVR for Power Prediction: Case Study
    6. References
  15. Chapter 5: Hidden Markov Model
    1. Discrete Markov Process
      1. Definition 1
      2. Definition 2
      3. Definition 3
    2. Introduction to the Hidden Markov Model
      1. Essentials of the Hidden Markov Model
      2. The Three Basic Problems of HMM
      3. Solutions to the Three Basic Problems of HMM
    3. Continuous Observation HMM
      1. Multivariate Gaussian Mixture Model
      2. Example: Workload Phase Recognition
      3. Monitoring and Observations
      4. Workload and Phase
      5. Mixture Models for Phase Detection
    4. References
  16. Chapter 6: Bioinspired Computing: Swarm Intelligence
    1. Applications
      1. Evolvable Hardware
      2. Bioinspired Networking
      3. Datacenter Optimization
    2. Bioinspired Computing Algorithms
    3. Swarm Intelligence
      1. Ant Colony Optimization Algorithm
      2. Particle Swarm Optimization
      3. Artificial Bee Colony Algorithm
    4. Bacterial Foraging Optimization Algorithm
    5. Artificial Immune System
    6. Distributed Management in Datacenters
      1. Workload Characterization
      2. Thermal Optimization
      3. Load Balancing
      4. Algorithm Model
    7. References
  17. Chapter 7: Deep Neural Networks
    1. Introducing ANNs
      1. Early ANN Structures
      2. Classical ANN
      3. ANN Training and the Backpropagation Algorithm
    2. DBN Overview
    3. Restricted Boltzmann Machines
    4. DNN-Related Research
      1. DNN Applications
      2. Parallel Implementations to Speed Up DNN Training
      3. Deep Networks Similar to DBN
    5. References
  18. Chapter 8: Cortical Algorithms
    1. Cortical Algorithm Primer
      1. Cortical Algorithm Structure
      2. Training of Cortical Algorithms
    2. Weight Update
      1. Workflow for CA Training
      2. Experimental Results
    3. Modified Cortical Algorithms Applied to Arabic Spoken Digits: Case Study
      1. Entropy-Based Weight Update Rule
      2. Experimental Validation
    4. References
  19. Chapter 9: Deep Learning
    1. Overview of Hierarchical Temporal Memory
    2. Hierarchical Temporal Memory Generations
      1. Sparse Distributed Representation
      2. Algorithmic Implementation
      3. Spatial Pooler
      4. Temporal Pooler
    3. Related Work
    4. Overview of Spiking Neural Networks
      1. Hodgkin-Huxley Model
      2. Integrate-and-Fire Model
      3. Leaky Integrate-and-Fire Model
      4. Izhikevich Model
      5. Thorpe’s Model
      6. Information Coding in SNN
      7. Learning in SNN
      8. SNN Variants and Extensions
    5. Conclusion
    6. References
  20. Chapter 10: Multiobjective Optimization
    1. Formal Definition
      1. Pareto Optimality
      2. Dominance Relationship
      3. Performance Measure
    2. Machine Learning: Evolutionary Algorithms
      1. Genetic Algorithm
      2. Genetic Programming
    3. Multiobjective Optimization: An Evolutionary Approach
      1. Weighted-Sum Approach
      2. Vector-Evaluated Genetic Algorithm
      3. Multiobjective Genetic Algorithm
      4. Niched Pareto Genetic Algorithm
      5. Nondominated Sorting Genetic Algorithm
      6. Strength Pareto Evolutionary Algorithm
      7. Strength Pareto Evolutionary Algorithm II
      8. Pareto Archived Evolutionary Strategy
      9. Pareto Envelope-Based Selection Algorithm
      10. Pareto Envelope-Based Selection Algorithm II
      11. Elitist Nondominated Sorting Genetic Algorithm
    4. Example: Multiobjective Optimization
    5. Objective Functions
    6. References
  21. Chapter 11: Machine Learning in Action: Examples
    1. Viable System Modeling
    2. Example 1: Workload Fingerprinting on a Compute Node
      1. Phase Determination
      2. Fingerprinting
      3. Forecasting
    3. Example 2: Dynamic Energy Allocation
      1. Learning Process: Feature Selection
      2. Learning Process: Optimization Planning
      3. Learning Process: Monitoring
    4. Model Training: Procedure and Evaluation
    5. Example 3: System Approach to Intrusion Detection
      1. Modeling Scheme
      2. Intrusion Detection System Architecture
    6. Profiles and System Considerations
    7. Sensor Data Measurements
    8. Summary
    9. References
  22. Index

Product information

  • Title: Efficient Learning Machines: Theories, Concepts, and Applications for Engineers and System Designers
  • Author(s): Mariette Awad, Rahul Khanna
  • Release date: May 2015
  • Publisher(s): Apress
  • ISBN: 9781430259909