The AdaBoost Algorithm

  • Initialize: $w_i^{(1)} = \frac{1}{N}$, $i = 1, 2, \ldots, N$
  • Initialize: m = 1
  • Repeat
    • Compute the optimum $\theta_m$ in $\phi(\cdot;\theta_m)$ by minimizing $P_m$; (4.135)
    • Compute the optimum $P_m$; (4.135)
    • $\alpha_m = \frac{1}{2}\ln\frac{1-P_m}{P_m}$
    • $Z_m = 0.0$
    • For i = 1 to N
      • $w_i^{(m+1)} = w_i^{(m)} \exp(-y_i \alpha_m \phi(x_i;\theta_m))$
      • $Z_m = Z_m + w_i^{(m+1)}$
    • End{For}
    • For i = 1 to N
      • $w_i^{(m+1)} = w_i^{(m+1)}/Z_m$
    • End {For}
    • K = m
    • m = m + 1
  • Until a termination criterion is met.
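To make the loop above concrete, the following is a minimal NumPy sketch of the algorithm. The pseudocode leaves the base classifier $\phi(\cdot;\theta)$ unspecified; the sketch assumes a decision stump trained on the weighted samples, labels $y_i \in \{-1, +1\}$, and a preset maximum number of rounds, and the function names (`train_stump`, `stump_predict`, `adaboost`, `predict`) are illustrative rather than taken from the text. The final decision is the sign of the weighted combination $\sum_{k=1}^{K} \alpha_k \phi(x;\theta_k)$ of the $K$ trained base classifiers.

```python
import numpy as np

def train_stump(X, y, w):
    """Fit a decision stump (one feature, one threshold, one sign) that
    minimizes the weighted error; the stump plays the role of phi(.; theta_m).
    (The stump base classifier is an assumption, not prescribed by the text.)"""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for t in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = np.where(X[:, j] <= t, sign, -sign)
                err = np.sum(w[pred != y])          # weighted error P_m for this stump
                if err < best_err:
                    best, best_err = (j, t, sign), err
    return best, best_err

def stump_predict(theta, X):
    j, t, sign = theta
    return np.where(X[:, j] <= t, sign, -sign)

def adaboost(X, y, n_rounds=20):
    """AdaBoost following the pseudocode above; labels y must be in {-1, +1}."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                         # w_i^(1) = 1/N
    thetas, alphas = [], []
    for m in range(n_rounds):
        theta, P_m = train_stump(X, y, w)           # optimum theta_m and P_m
        P_m = max(P_m, 1e-12)                       # guard against log(0)
        if P_m >= 0.5:                              # no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - P_m) / P_m)     # alpha_m
        pred = stump_predict(theta, X)
        w = w * np.exp(-y * alpha * pred)           # w_i^(m+1) before normalization
        w = w / w.sum()                             # divide by Z_m
        thetas.append(theta)
        alphas.append(alpha)
    return thetas, alphas

def predict(thetas, alphas, X):
    """Final classifier: sign of the weighted sum of the K base classifiers."""
    F = sum(a * stump_predict(th, X) for a, th in zip(alphas, thetas))
    return np.sign(F)
```

As a termination criterion, the sketch stops either after a preset number of rounds or as soon as the current base classifier does no better than chance on the weighted samples ($P_m \ge 0.5$); this is a common practical choice, not one mandated by the pseudocode.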

One of the main and very interesting properties of boosting is its relative immunity to ...
