6.3. Mixture-of-Experts Modular Networks
An expert-based modular network is built from smaller modules, each representing the behavior of a local or specially tailored pattern space. The most prominent expert-based modular network is the mixture-of-experts (MOE) [162]. The MOE exhibits an explicit relationship with statistical pattern classification methods. Given a pattern, each expert network estimates the pattern's conditional a posteriori probability on the (adaptively tuned or preassigned) feature space. Each local expert network performs multiway classification over K classes by using either K independent binomial models, each dedicated to one and only one class, or a single multinomial model covering all K classes. The corresponding output of ...
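The combination step described above can be sketched as follows. This is a minimal illustration, not the book's implementation: the parameter shapes, the random weights, and the helper names (`moe_posterior`, `W_experts`, `W_gate`) are all assumptions chosen for compactness. It shows the standard MOE forward pass, in which a gating network assigns weights g_j(x) to the J experts and the mixture posterior is P(k | x) = sum_j g_j(x) P_j(k | x):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
K, J, D = 3, 2, 4  # classes, experts, input dimension (illustrative sizes)

# Hypothetical random parameters: one linear classifier per expert,
# plus one linear gating network over the same feature space.
W_experts = rng.normal(size=(J, D, K))
W_gate = rng.normal(size=(D, J))

def moe_posterior(x):
    """Mixture posterior: P(k | x) = sum_j g_j(x) * P_j(k | x)."""
    g = softmax(x @ W_gate)                             # gating weights, shape (J,)
    P = softmax(np.einsum('d,jdk->jk', x, W_experts))   # expert posteriors, shape (J, K)
    return g @ P                                        # combined posterior, shape (K,)

x = rng.normal(size=D)
p = moe_posterior(x)
# p is a valid probability distribution over the K classes
```

Because each expert's output and the gating weights are individually normalized, the combined output is itself a proper posterior distribution, which is what links the MOE directly to statistical classification.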