In our little universe, we have data for everyone, so we could easily create a
rule to predict the next diabetic patient. In real-world applications, however, we never
store a complete dataset covering all patients, and this is where the real power of
ML comes in: ML can make predictions even when the dataset does
not contain all the possible samples. For instance, suppose we delete the last
two records from the example above. An ML algorithm would then process the attributes of each incoming
patient record and predict whether or not that person will contract diabetes. The learned mapping
that produces these predictions is referred to as a model.
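To make this concrete, here is a minimal sketch of that idea in Python. The patient records and attribute values are hypothetical, and a simple nearest-neighbour rule stands in for whatever algorithm the book's example uses; the point is only that the model can label a record it has never stored.

```python
# Hypothetical toy dataset: (fasting glucose, BMI) -> diabetic (1) or not (0).
# A minimal nearest-neighbour sketch: the "model" generalizes to records
# it has never seen, which is what distinguishes ML from a lookup table.
records = [
    ((90, 22.0), 0),
    ((105, 27.5), 0),
    ((150, 31.0), 1),
    ((165, 35.2), 1),
]

def predict(glucose, bmi):
    # Predict the label of the closest stored record (1-nearest neighbour).
    def dist(rec):
        (g, b), _ = rec
        return (g - glucose) ** 2 + (b - bmi) ** 2
    return min(records, key=dist)[1]

print(predict(158, 33.0))  # a record not in the dataset -> 1 (diabetic)
```

A real system would use far more attributes and a properly trained classifier, but the interface is the same: attributes in, prediction out.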
ML problems are often categorized as regression or classification problems.
Regression
Regression is used for predicting “continuous” outcomes. In regression, the answer to a
question is a numeric value computed by the model, rather than one of a finite set of labels. If
you search for “regression”, you will find many Statistics-based links. This is because regression is one of
the fundamental branches of Statistics, used for calculating the relationship between variables.
In ML, it helps to calculate predictions for events by determining the relationship between the
input variables in the dataset. Typically, a simple regression model adheres to the following form:
Prediction Outcome = Coefficient1 + Coefficient2 * Input
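The two coefficients in this form can be estimated from data by ordinary least squares. As a sketch (the sample data below is made up to fit the line y = 1 + 2x exactly):

```python
def fit_simple_regression(xs, ys):
    # Ordinary least squares for: y = b0 + b1 * x
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    # Intercept: chosen so the line passes through the mean point.
    b0 = mean_y - b1 * mean_x
    return b0, b1

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]          # exactly y = 1 + 2x
b0, b1 = fit_simple_regression(xs, ys)
print(b0, b1)                  # 1.0 2.0
```

Here `b0` plays the role of Coefficient1 and `b1` the role of Coefficient2 in the equation above.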
Logistic Regression
When we use the term “logistic” regression, our focus is on the algorithm's core function,
known as the logistic function. This function is also referred to as the sigmoid function. It
originated in Statistics, where it was used to describe population growth in ecology: a rapid
initial rise that levels off as the environment's carrying capacity is approached. The function
makes an S-shaped curve; it takes any real number as input and maps it to a value in the
range of 0 to 1.
Logistic regression is represented by an equation. The input values, represented
by x, are combined with coefficient values or “weights” to estimate the outcome. This
output is represented by y. Consider the following equation of logistic regression:
y = e^(b0 + b1*x) / (1 + e^(b0 + b1*x))
[Figure: Logistic regression model. Inputs X1, X2, X3 are combined with weights θ1, θ2, θ3 to produce the output: Happy or Sad.]
Chapter 10 Data Analytics and Machine Learning for IoT 255