Computing with Mixed Distance Functions
When dealing with observations that have multiple features, we should be aware that those features are often measured on very different scales. In this recipe, we account for that to improve our housing value predictions.
Getting ready
It is important to extend the nearest neighbor algorithm to take into account variables that are measured on different scales. In this example, we will show how to scale the distance function for different variables. Specifically, we will scale the distance function as a function of the feature variance.
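As a concrete illustration of this idea, the short sketch below builds a diagonal weight matrix from the per-feature standard deviations of a data matrix. The array name x_vals and the choice of using the standard deviation directly as the weight are assumptions for illustration, and NumPy stands in here for the TensorFlow operations used in the full recipe.

```python
import numpy as np

# Hypothetical feature matrix: rows are observations, columns are features
# (for example, the housing features used in this recipe).
x_vals = np.random.rand(100, 10)

# Per-feature standard deviation; features with larger spread get larger entries.
weight_diagonal = x_vals.std(axis=0)

# Diagonal weight matrix A used to scale the distance metric feature by feature.
weight_matrix = np.diag(weight_diagonal)
```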
The key to weighting the distance function is to use a weight matrix. Written with matrix operations, the distance function becomes the following formula:

D(x, y) = sqrt((x − y)^T A (x − y))

Here, A is a diagonal weight matrix that we will use to scale the distance metric for each feature.
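To make the formula concrete, here is a minimal sketch that computes the weighted distance exactly as written above. The function name weighted_distance and the sample values are purely illustrative, and NumPy is used in place of the TensorFlow graph operations the recipe builds.

```python
import numpy as np

def weighted_distance(x, y, A):
    """Weighted Euclidean distance: sqrt((x - y)^T A (x - y))."""
    diff = (x - y).reshape(-1, 1)          # column vector of feature differences
    return float(np.sqrt(diff.T @ A @ diff))

# Illustrative example with three features and hypothetical weights.
x = np.array([0.5, 1.0, 2.0])
y = np.array([0.4, 1.5, 1.0])
A = np.diag([2.0, 0.5, 1.0])               # diagonal weight matrix
print(weighted_distance(x, y, A))          # weighted distance between x and y
```

When A is the identity matrix, this reduces to the ordinary Euclidean distance; the diagonal entries simply rescale how much each feature contributes.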