
Declaring our stopwords list

Here, we can directly reuse the list of generic English stopwords provided by Spark. However, we can enrich it with our own domain-specific stopwords:

import org.apache.spark.ml.feature.StopWordsRemover

// Spark's default English stopword list, extended with a few corpus-specific terms
val stopWords = StopWordsRemover.loadDefaultStopWords("english") ++ Array("ax", "arent", "re")

As stated earlier, this is an extremely delicate task that depends heavily on the business problem you are trying to solve. You may wish to add domain-specific terms to this list, terms that occur often in your data but carry no signal for the prediction task.
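If you would rather apply the list as a DataFrame transformation instead of inside a custom function, the same array can be passed to Spark's StopWordsRemover transformer. A minimal sketch; the column names reviewTokens and filteredTokens are assumptions, not names from this chapter:

import org.apache.spark.ml.feature.StopWordsRemover

val remover = new StopWordsRemover()
  .setInputCol("reviewTokens")     // input: Array[String] of raw tokens (assumed column name)
  .setOutputCol("filteredTokens")  // output: tokens with stopwords removed (assumed column name)
  .setStopWords(stopWords)         // the enriched list declared above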

Next, declare a tokenizer function that splits reviews into tokens and omits all stopwords and words that are too short:

val MIN_TOKEN_LENGTH = 3
val toTokens = (minTokenLen: Int, stopWords: Array[String], review: String) => ...
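The body of the function is elided above. A minimal sketch of how it might look, assuming a simple non-word-character split and a length threshold (the exact splitting rule and whether the length comparison is strict are assumptions, not the book's exact code):

// Hypothetical completion of the elided body (an assumption, not the book's exact code)
val toTokens = (minTokenLen: Int, stopWords: Array[String], review: String) =>
  review.split("""\W+""")                          // split on runs of non-word characters
    .map(_.toLowerCase)                            // normalize case
    .filter(token => token.length >= minTokenLen)  // drop tokens that are too short
    .filterNot(token => stopWords.contains(token)) // drop stopwords

Applied as toTokens(MIN_TOKEN_LENGTH, stopWords, review), this returns only the lowercase tokens that survive both filters.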
