Video description
In Video Editions the narrator reads the book while the content, figures, code listings, diagrams, and text appear on the screen. Like an audiobook that you can also watch as a video.
Did you think machine learning is complicated and hard to master? It’s not! Read this book! Serrano demystifies some of the best-kept secrets of the machine learning society.
Sebastian Thrun, Founder, Udacity
Discover valuable machine learning techniques you can understand and apply using just high-school math.
In Grokking Machine Learning you will learn:
- Supervised algorithms for classifying and splitting data
- Methods for cleaning and simplifying data
- Machine learning packages and tools
- Neural networks and ensemble methods for complex datasets
Grokking Machine Learning teaches you how to apply ML to your projects using only standard Python code and high school-level math. No specialist knowledge is required to tackle the hands-on exercises using Python and readily available machine learning tools. Packed with easy-to-follow Python-based exercises and mini-projects, this book sets you on the path to becoming a machine learning expert.
about the technology
Discover powerful machine learning techniques you can understand and apply using only high school math! Put simply, machine learning is a set of techniques for data analysis based on algorithms that deliver better results as you give them more data. ML powers many cutting-edge technologies, such as recommendation systems, facial recognition software, smart speakers, and even self-driving cars. This unique book introduces the core concepts of machine learning, using relatable examples, engaging exercises, and crisp illustrations.
about the book
Grokking Machine Learning presents machine learning algorithms and techniques in a way that anyone can understand. This book skips the confusing academic jargon and offers clear explanations that require only basic algebra. As you go, you’ll build interesting projects with Python, including models for spam detection and image recognition. You’ll also pick up practical skills for cleaning and preparing data.
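As a taste of the kind of project the book builds, here is a minimal naive Bayes spam filter in plain Python. This sketch is illustrative rather than the book’s own code; the tiny dataset and word features are invented for the example.

```python
from collections import Counter

# Toy training data (invented for illustration): each email is a set of words.
spam = [{"win", "money", "now"}, {"free", "money"}, {"win", "prize", "free"}]
ham = [{"meeting", "tomorrow"}, {"project", "update", "tomorrow"}, {"lunch", "meeting"}]

def word_counts(emails):
    counts = Counter()
    for words in emails:
        counts.update(words)
    return counts

spam_counts, ham_counts = word_counts(spam), word_counts(ham)

def spam_probability(words):
    # Naive Bayes with add-one (Laplace) smoothing: multiply per-word
    # likelihoods under each class, then normalize the two scores.
    p_spam, p_ham = len(spam), len(ham)
    vocab = set(spam_counts) | set(ham_counts)
    for w in words:
        p_spam *= (spam_counts[w] + 1) / (len(spam) + len(vocab))
        p_ham *= (ham_counts[w] + 1) / (len(ham) + len(vocab))
    return p_spam / (p_spam + p_ham)

print(round(spam_probability({"win", "money"}), 3))        # → 0.9
print(round(spam_probability({"meeting", "tomorrow"}), 3))  # → 0.1
```

The same idea, with real datasets and library support, is what chapter 8 develops step by step.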
about the audience
No machine learning knowledge necessary, but basic Python required.
about the author
Luis G. Serrano is a research scientist in quantum artificial intelligence. Previously, he was a Machine Learning Engineer at Google and Lead Artificial Intelligence Educator at Apple.
The first step to take on your machine learning journey.
Millad Dagdoni, Norwegian Labour and Welfare Administration
A nicely written guided introduction, especially for those who want to code but feel shaky in their mathematics.
Erik D. Sapper, California Polytechnic State University
The most approachable introduction to machine learning I’ve had the pleasure to read in recent years. Highly recommended.
Kay Engelhardt, devstats
NARRATED BY MARIANNE SHEEHAN
Table of contents
- Chapter 1. What is machine learning? It is common sense, except done by a computer
- Chapter 1. What is machine learning?
- Chapter 1. Some examples of models that humans use
- Chapter 1. Example 4: More?
- Chapter 2. Types of machine learning
- Chapter 2. Supervised learning: The branch of machine learning that works with labeled data
- Chapter 2. Unsupervised learning: The branch of machine learning that works with unlabeled data
- Chapter 2. Dimensionality reduction simplifies data without losing too much information
- Chapter 2. What is reinforcement learning?
- Chapter 3. Drawing a line close to our points: Linear regression
- Chapter 3. The remember step: Looking at the prices of existing houses
- Chapter 3. Some questions that arise and some quick answers
- Chapter 3. Crash course on slope and y-intercept
- Chapter 3. Simple trick
- Chapter 3. The linear regression algorithm: Repeating the absolute or square trick many times to move the line closer to the points
- Chapter 3. How do we measure our results? The error function
- Chapter 3. Gradient descent: How to decrease an error function by slowly descending from a mountain
- Chapter 3. Real-life application: Using Turi Create to predict housing prices in India
- Chapter 3. Parameters and hyperparameters
- Chapter 4. Optimizing the training process: Underfitting, overfitting, testing, and regularization
- Chapter 4. How do we get the computer to pick the right model? By testing
- Chapter 4. A numerical way to decide how complex our model should be: The model complexity graph
- Chapter 4. Another example of overfitting: Movie recommendations
- Chapter 4. Modifying the error function to solve our problem: Lasso regression and ridge regression
- Chapter 4. An intuitive way to see regularization
- Chapter 4. Polynomial regression, testing, and regularization with Turi Create
- Chapter 4. Polynomial regression, testing, and regularization with Turi Create: The testing RMSE for the models follows
- Chapter 5. Using lines to split our points: The perceptron algorithm
- Chapter 5. The problem: We are on an alien planet, and we don’t know their language!
- Chapter 5. Sentiment analysis classifier
- Chapter 5. The step function and activation functions: A condensed way to get predictions
- Chapter 5. The bias, the y-intercept, and the inherent mood of a quiet alien
- Chapter 5. Error function 3: Score
- Chapter 5. Pseudocode for the perceptron trick (geometric)
- Chapter 5. Bad classifier
- Chapter 5. Pseudocode for the perceptron algorithm
- Chapter 5. Coding the perceptron algorithm using Turi Create
- Chapter 6. A continuous approach to splitting points: Logistic classifiers
- Chapter 6. The dataset and the predictions
- Chapter 6. Error function 3: log loss
- Chapter 6. Formula for the log loss
- Chapter 6. Pseudocode for the logistic trick
- Chapter 6. Coding the logistic regression algorithm
- Chapter 6. Classifying into multiple classes: The softmax function
- Chapter 7. How do you measure classification models? Accuracy and its friends
- Chapter 7. False positives and false negatives: Which one is worse?
- Chapter 7. Recall: Among the positive examples, how many did we correctly classify?
- Chapter 7. Combining recall and precision as a way to optimize both: The F-score
- Chapter 7. A useful tool to evaluate our model: The receiver operating characteristic (ROC) curve
- Chapter 7. The receiver operating characteristic (ROC) curve: A way to optimize sensitivity and specificity in a model
- Chapter 7. A metric that tells us how good our model is: The AUC (area under the curve)
- Chapter 7. Recall is sensitivity, but precision and specificity are different
- Chapter 7. Summary
- Chapter 8. Using probability to its maximum: The naive Bayes model
- Chapter 8. Sick or healthy? A story with Bayes’ theorem as the hero: Let’s calculate this probability.
- Chapter 8. Prelude to Bayes’ theorem: The prior, the event, and the posterior
- Chapter 8. What the math just happened? Turning ratios into probabilities
- Chapter 8. What the math just happened? Turning ratios into probabilities: Product rule of probabilities
- Chapter 8. What about two words? The naive Bayes algorithm
- Chapter 8. What about more than two words?
- Chapter 8. Implementing the naive Bayes algorithm
- Chapter 9. Splitting data by asking questions: Decision trees
- Chapter 9. Picking a good first question
- Chapter 9. The solution: Building an app-recommendation system
- Chapter 9. Gini impurity index: How diverse is my dataset?
- Chapter 9. Entropy: Another measure of diversity with strong applications in information theory
- Chapter 9. Classes of different sizes? No problem: We can take weighted averages
- Chapter 9. Beyond questions like yes/no
- Chapter 9. The graphical boundary of decision trees
- Chapter 9. Setting hyperparameters in Scikit-Learn
- Chapter 9. Applications
- Chapter 10. Combining building blocks to gain more power: Neural networks
- Chapter 10. Why two lines? Is happiness not linear?
- Chapter 10. The boundary of a neural network
- Chapter 10. Potential problems: From overfitting to vanishing gradients
- Chapter 10. Neural networks with more than one output: The softmax function
- Chapter 10. Training the model
- Chapter 10. Other architectures for more complex datasets
- Chapter 10. How neural networks paint paintings: Generative adversarial networks (GAN)
- Chapter 11. Finding boundaries with style: Support vector machines and the kernel method
- Chapter 11. Distance error function: Trying to separate our two lines as far apart as possible
- Chapter 11. Training SVMs with nonlinear boundaries: The kernel method
- Chapter 11. Going beyond quadratic equations: The polynomial kernel
- Chapter 11. A measure of how close points are: Similarity
- Chapter 11. Overfitting and underfitting with the RBF kernel: The gamma parameter
- Chapter 12. Combining models to maximize results: Ensemble learning
- Chapter 12. Fitting a random forest manually
- Chapter 12. Combining the weak learners into a strong learner
- Chapter 12. Gradient boosting: Using decision trees to build strong learners
- Chapter 12. XGBoost similarity score: A new and effective way to measure similarity in a set
- Chapter 12. Building the weak learners: Split at 25
- Chapter 12. Tree pruning: A way to reduce overfitting by simplifying the weak learners
- Chapter 13. Putting it all in practice: A real-life example of data engineering and machine learning
- Chapter 13. Using Pandas to study our dataset
- Chapter 13. Turning categorical data into numerical data: One-hot encoding
- Chapter 13. Feature selection: Getting rid of unnecessary features
- Chapter 13. Testing each model’s accuracy
- Chapter 13. Tuning the hyperparameters to find the best model: Grid search
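Chapter 3’s core idea, repeating the square trick many times to nudge a line toward the points, can be sketched in a few lines of plain Python. The toy data, learning rate, and iteration count here are invented for illustration, not taken from the book.

```python
import random

# Toy data (invented): points lying roughly on the line y = 2x + 1.
points = [(1, 3.1), (2, 4.9), (3, 7.2), (4, 8.8)]

slope, intercept = 0.0, 0.0
learning_rate = 0.01

# The "square trick" repeated many times: pick a point at random and move
# the line toward it in proportion to the prediction error. This is
# stochastic gradient descent on the squared error for that point.
random.seed(0)
for _ in range(10000):
    x, y = random.choice(points)
    error = (slope * x + intercept) - y
    slope -= learning_rate * error * x
    intercept -= learning_rate * error

print(round(slope, 1), round(intercept, 1))  # close to 2 and 1
```

The fitted slope and intercept end up near the values used to generate the toy points, which is the behavior the chapter builds intuition for before introducing Turi Create.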
Product information
- Title: Grokking Machine Learning, video edition
- Author(s): Luis G. Serrano
- Release date: December 2021
- Publisher(s): Manning Publications
- ISBN: None