
Optimization Techniques for Solving Complex Problems by Juan Antonio Gomez, Coromoto Leon, Pedro Isasi, Christian Blum, Enrique Alba


CHAPTER 2

Neural Lazy Local Learning

J. M. VALLS, I. M. GALVÁN, and P. ISASI

Universidad Carlos III de Madrid, Spain

2.1 INTRODUCTION

Lazy learning methods [13] are conceptually straightforward approaches to approximating real- or discrete-valued target functions. These learning algorithms defer the decision of how to generalize beyond the training data until a new query is encountered. When the query instance is received, a set of similar, related patterns is retrieved from the set of available training patterns and is used to approximate the output for the new instance. Similar patterns are chosen by means of a distance metric, in such a way that nearby points have higher relevance.

Lazy methods generally work by selecting the k input patterns nearest to the query point. Usually, the metric used is the Euclidean distance. Afterward, a local approximation using the selected samples is carried out in order to generalize to the new instance. The most basic form is the k-nearest neighbor method [4]. In this case, the approximation for the new sample is simply the most common output value among the k examples selected. A refinement of this method, called weighted k-nearest neighbor [4], can also be used; it weights the contribution of each of the k neighbors according to its distance to the new query, giving greater weight to closer neighbors. Another strategy to determine the ...
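The two variants just described can be sketched as follows. This is a minimal illustration, not the chapter's implementation: `knn_classify` is a hypothetical helper that selects the k nearest training patterns by Euclidean distance and returns either the most common output among them (plain k-NN) or the output with the largest inverse-distance-weighted vote (weighted k-NN).

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, query, k=3, weighted=False):
    """Lazy prediction for `query` from its k nearest training patterns.

    X_train : array of input patterns, one per row
    y_train : list of the corresponding (discrete) outputs
    """
    # Euclidean distance from the query to every stored training pattern
    dists = np.linalg.norm(np.asarray(X_train) - np.asarray(query), axis=1)
    nearest = np.argsort(dists)[:k]  # indices of the k closest patterns

    if not weighted:
        # plain k-NN: the most common output value among the k examples
        return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

    # weighted k-NN: each neighbor votes with strength 1/distance,
    # so closer neighbors contribute more to the decision
    votes = {}
    for i in nearest:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / (dists[i] + 1e-12)
    return max(votes, key=votes.get)
```

With one-dimensional patterns `[[0], [1], [2]]` labeled `['a', 'a', 'b']` and the query `[1.9]`, plain k-NN with k = 3 returns `'a'` (two of the three neighbors), whereas the weighted variant returns `'b'`, since the single very close neighbor outweighs the two distant ones.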
