Book description
Many industry experts consider unsupervised learning the next frontier in artificial intelligence, one that may hold the key to general artificial intelligence. Since the majority of the world's data is unlabeled, conventional supervised learning cannot be applied. Unsupervised learning, on the other hand, can be applied to unlabeled datasets to discover meaningful patterns buried deep in the data, patterns that may be nearly impossible for humans to uncover.
Author Ankur Patel shows you how to apply unsupervised learning using two simple, production-ready Python frameworks: Scikit-learn and TensorFlow using Keras. With code and hands-on examples, data scientists will identify difficult-to-find patterns in data and gain deeper business insight, detect anomalies, perform automatic feature engineering and selection, and generate synthetic datasets. All you need is programming and some machine learning experience to get started.
 Compare the strengths and weaknesses of the different machine learning approaches: supervised, unsupervised, and reinforcement learning
 Set up and manage machine learning projects end-to-end
 Build an anomaly detection system to catch credit card fraud
 Cluster users into distinct and homogeneous groups
 Perform semisupervised learning
 Develop movie recommender systems using restricted Boltzmann machines
 Generate synthetic images using generative adversarial networks
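As a taste of the techniques listed above, here is a minimal, illustrative sketch of PCA-based anomaly detection with scikit-learn, the idea behind the book's fraud-detection chapter. This is not code from the book: the synthetic data, the three-component PCA, and the 98th-percentile cutoff are all assumptions chosen for the example. Points that reconstruct poorly from the principal components learned on normal data are flagged as anomalies.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic data: 500 "normal" samples plus 10 obvious outliers.
rng = np.random.default_rng(42)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 10))
outliers = rng.normal(loc=6.0, scale=1.0, size=(10, 10))
X = np.vstack([normal, outliers])

# Fit PCA on the normal portion only, then reconstruct every sample
# from its low-dimensional projection.
pca = PCA(n_components=3).fit(normal)
X_hat = pca.inverse_transform(pca.transform(X))

# Per-sample mean squared reconstruction error: anomalies, which do not
# fit the structure PCA learned, reconstruct poorly.
errors = np.mean((X - X_hat) ** 2, axis=1)

# Flag the worst ~2% of samples (threshold is an arbitrary choice here).
threshold = np.percentile(errors, 98)
flagged = np.where(errors > threshold)[0]
print(flagged)
```

In a real fraud setting the threshold would be tuned against precision-recall on a labeled holdout, as the book's evaluation chapters discuss.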
Table of contents

Preface
 A Brief History of Machine Learning
 AI Is Back, but Why Now?
 The Emergence of Applied AI
 Major Milestones in Applied AI over the Past 20 Years
 From Narrow AI to AGI
 Objective and Approach
 Prerequisites
 Roadmap
 Conventions Used in This Book
 Using Code Examples
 O’Reilly Online Learning
 How to Contact Us
 Acknowledgments
 I. Fundamentals of Unsupervised Learning

1. Unsupervised Learning in the Machine Learning Ecosystem
 Basic Machine Learning Terminology
 Rules-Based vs. Machine Learning
 Supervised vs. Unsupervised
 Using Unsupervised Learning to Improve Machine Learning Solutions
 A Closer Look at Supervised Algorithms
 A Closer Look at Unsupervised Algorithms
 Reinforcement Learning Using Unsupervised Learning
 Semisupervised Learning
 Successful Applications of Unsupervised Learning
 Conclusion

2. End-to-End Machine Learning Project

Environment Setup
 Version Control: Git
 Clone the Hands-On Unsupervised Learning Git Repository
 Scientific Libraries: Anaconda Distribution of Python
 Neural Networks: TensorFlow and Keras
 Gradient Boosting, Version One: XGBoost
 Gradient Boosting, Version Two: LightGBM
 Clustering Algorithms
 Interactive Computing Environment: Jupyter Notebook
 Overview of the Data
 Data Preparation
 Model Preparation
 Machine Learning Models (Part I)
 Evaluation Metrics
 Machine Learning Models (Part II)
 Evaluation of the Four Models Using the Test Set
 Ensembles
 Final Model Selection
 Production Pipeline
 Conclusion

 II. Unsupervised Learning Using Scikit-Learn

3. Dimensionality Reduction
 The Motivation for Dimensionality Reduction
 Dimensionality Reduction Algorithms
 Principal Component Analysis
 Singular Value Decomposition
 Random Projection
 Isomap
 Multidimensional Scaling
 Locally Linear Embedding
 t-Distributed Stochastic Neighbor Embedding
 Other Dimensionality Reduction Methods
 Dictionary Learning
 Independent Component Analysis
 Conclusion

4. Anomaly Detection
 Credit Card Fraud Detection
 Normal PCA Anomaly Detection
 Sparse PCA Anomaly Detection
 Kernel PCA Anomaly Detection
 Gaussian Random Projection Anomaly Detection
 Sparse Random Projection Anomaly Detection
 Nonlinear Anomaly Detection
 Dictionary Learning Anomaly Detection
 ICA Anomaly Detection
 Fraud Detection on the Test Set
 Conclusion
 5. Clustering
 6. Group Segmentation
 III. Unsupervised Learning Using TensorFlow and Keras
 7. Autoencoders

8. Hands-On Autoencoder
 Data Preparation
 The Components of an Autoencoder
 Activation Functions
 Our First Autoencoder
 Two-Layer Undercomplete Autoencoder with Linear Activation Function
 Nonlinear Autoencoder
 Overcomplete Autoencoder with Linear Activation
 Overcomplete Autoencoder with Linear Activation and Dropout
 Sparse Overcomplete Autoencoder with Linear Activation
 Sparse Overcomplete Autoencoder with Linear Activation and Dropout
 Working with Noisy Datasets
 Denoising Autoencoder
 Conclusion
 9. Semisupervised Learning
 IV. Deep Unsupervised Learning Using TensorFlow and Keras
 10. Recommender Systems Using Restricted Boltzmann Machines
 11. Feature Detection Using Deep Belief Networks
 12. Generative Adversarial Networks

13. Time Series Clustering
 ECG Data
 Approach to Time Series Clustering
 Time Series Clustering Using k-Shape on ECGFiveDays
 Time Series Clustering Using k-Shape on ECG5000
 Time Series Clustering Using k-Means on ECG5000
 Time Series Clustering Using Hierarchical DBSCAN on ECG5000
 Comparing the Time Series Clustering Algorithms
 Conclusion
 14. Conclusion
 Index
 About the Author
Product information
 Title: Hands-On Unsupervised Learning Using Python
 Author(s): Ankur Patel
 Release date: March 2019
 Publisher(s): O'Reilly Media, Inc.
 ISBN: 9781492035640