Errata
The errata list contains errors and their corrections that were found after the product was released. If the error was corrected in a later version or reprint, the date of the correction is shown in the column titled "Date Corrected".
The following errata were submitted by our customers and approved as valid errors by the author or editor.
Color key: Serious technical mistake | Minor technical mistake | Language or formatting error | Typo | Question | Note | Update
| Version | Location | Description | Submitted By | Date submitted | Date corrected |
|---|---|---|---|---|---|
| Printed, PDF, ePub | Page 47, the paragraph before the section "System-Specific Error Correction" | In Chapter 2, under "text extraction and cleanup" -> "spelling correction": "For example, if “Hello” is a valid word that is already present in the dictionary, then the addition of “o” (minimal) to “Hllo” would make the correction." Change "o" to "e": "…the addition of “e” (minimal) to “Hllo” would make the correction." | Jin Tao | Jul 04, 2020 | |
| Printed, PDF, ePub | Page 65, the 2nd line on that page | "Pre-process your input to the ML model: If the heuristic has a really *high prediction* for a particular kind of class, then it’s best to use it before feeding the data in your ML model. For instance, if for certain words in an email, there’s a 99% chance that it’s spam, then it’s best to classify that email as spam instead of sending it to an ML model." -> A noun seems to be missing after "high prediction" (e.g., "high prediction accuracy"), or rephrase as "If the heuristic predicts really well for a particular kind of class". (The submitter adds: what a great book; since it is on the practical side, are there recommended resources beyond the references for going deeper?) | Jin Tao | Jul 04, 2020 | |
| Printed, PDF, ePub | Page 98 or 99, the 1st paragraph in the section "CBOW" | "Given a sentence of, say, m words, it assigns a probability Pr(w1, w2, …, wn) to the whole sentence." -> Change "m" to "n": "Given a sentence of, say, n words, it assigns a probability Pr(w1, w2, …, wn) to the whole sentence." | Jin Tao | Jul 04, 2020 | |
| Printed, PDF, ePub | Page 190, second paragraph in the "Case Study" section | "an email my contain multiple meeting mentions, like in this example: 'MountLogan was a good venue. Let us meet there tomorrow and have an all hands in MountRainer on Thursday.'" -> Change "my" to "might". | Jin Tao | Jul 12, 2020 | |
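The Page 47 erratum is about minimal-edit spelling correction: choose the dictionary word reachable from the misspelling with the fewest edits. A minimal sketch of that idea, using a Levenshtein-distance lookup (the tiny dictionary here is illustrative, not from the book):

```python
# Minimal-edit spelling correction sketch: pick the dictionary word
# with the smallest Levenshtein (edit) distance to the misspelled word.
# The tiny vocabulary below is illustrative, not from the book.

def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def correct(word: str, dictionary: set) -> str:
    """Return the word itself if known, else the closest dictionary word."""
    if word in dictionary:
        return word
    return min(dictionary, key=lambda w: edit_distance(word, w))

vocab = {"hello", "help", "world", "hollow"}
print(correct("hllo", vocab))  # "hello": inserting "e" is a single edit
```

As in the corrected sentence, "Hllo" becomes "Hello" because adding "e" is the single (minimal) edit needed.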
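The Page 65 erratum concerns applying a high-precision heuristic before the ML model. A sketch of that pattern under the book's spam example (the keyword list and the stub classifier are made up for illustration):

```python
# Heuristic pre-filter before an ML model: when a rule is almost always
# right for one class, apply it first and only fall back to the model.
# The keywords and the stub model are illustrative, not from the book.

SPAM_KEYWORDS = {"lottery winner", "wire transfer", "free!!!"}

def ml_model_predict(email: str) -> str:
    """Stand-in for a trained classifier (hypothetical)."""
    return "not_spam"

def classify(email: str) -> str:
    text = email.lower()
    # High-precision rule: these phrases indicate spam ~99% of the time,
    # so short-circuit before invoking the (more expensive) ML model.
    if any(kw in text for kw in SPAM_KEYWORDS):
        return "spam"
    return ml_model_predict(email)

print(classify("You are a lottery winner, claim now"))  # "spam"
print(classify("Meeting moved to 3pm"))                 # "not_spam"
```

The design point is ordering: the heuristic handles the near-certain cases cheaply, and only ambiguous inputs reach the model.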