Support Vector Machines — Lecture series — Sequential Minimal Optimization Part 4

David Sasu
2 min read · Oct 12, 2021

In the last few posts in this series, we have been exploring how the sequential minimal optimization (SMO) algorithm works. The algorithm starts with a vector of Lagrange multipliers, and its objective is to repeatedly pick 2 of these multipliers and change their values in a way that still respects the constraints of the underlying optimization problem. In the previous post, we looked at the process we can employ to figure out which two Lagrange multipliers to select from our given vector of multipliers. In this post, we will explore exactly how to optimise them after they have been selected.

Learning objective:

How to optimise the chosen Lagrange multipliers with an analytical approach.

Main question:

What analytical approach can I employ to optimise the chosen Lagrange multipliers?

In this analytical approach, there are 2 main formulae used to transform the chosen Lagrange multipliers into their desired values. Assuming that the 2 chosen Lagrange multipliers are alpha_1 and alpha_2, and that the target values associated with them are y1 and y2, the formula used under this approach to compute the new value of alpha_2 is:
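Following the notation of Platt's original SMO paper, this update takes the form:

$$\alpha_2^{\text{new}} = \alpha_2 + \frac{y_2 (E_1 - E_2)}{\eta}, \qquad \eta = K(x_1, x_1) + K(x_2, x_2) - 2K(x_1, x_2)$$

Here $\eta$ is the second derivative of the objective along the constraint direction; the symbols $E_i$ and $K$ are defined just below.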

Here E_i = f(x_i) - y_i is the difference between the output of the hypothesis function f on a particular example x_i and its corresponding target value y_i, and K is a kernel function. In addition to computing the new value of alpha_2, certain bounds are also computed, and the value of alpha_2 must not be smaller or larger than these bounds. If the computed value of alpha_2 happens to fall outside its bounds, it is clipped to fit.
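To make the clipping step concrete, here is a minimal Python sketch. The bounds L and H below are the standard SMO box bounds from Platt's paper (the post does not spell them out); the function name and signature are my own illustration:

```python
def clip_alpha2(alpha2_new, alpha1, alpha2, y1, y2, C):
    """Clip the unconstrained alpha_2 update to its feasible range [L, H].

    The bounds keep both multipliers inside the box 0 <= alpha <= C while
    preserving the linear constraint y1*alpha1 + y2*alpha2 = constant.
    """
    if y1 != y2:
        # Targets have opposite signs: alpha2 - alpha1 stays constant.
        L = max(0.0, alpha2 - alpha1)
        H = min(C, C + alpha2 - alpha1)
    else:
        # Targets have the same sign: alpha1 + alpha2 stays constant.
        L = max(0.0, alpha1 + alpha2 - C)
        H = min(C, alpha1 + alpha2)
    # Clip the newly computed value into [L, H].
    return min(max(alpha2_new, L), H)
```

For example, with C = 1.0, y1 != y2, old values alpha1 = 0.2 and alpha2 = 0.5, the feasible range is [0.3, 1.0], so an unconstrained update of 1.5 would be clipped down to 1.0.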

The formula that is then used to compute the new value of alpha_1 is:
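Again in Platt's notation, with the old values on the right-hand side and the clipped alpha_2 from the previous step:

$$\alpha_1^{\text{new}} = \alpha_1 + y_1 y_2 \left( \alpha_2 - \alpha_2^{\text{new,clipped}} \right)$$

This choice moves alpha_1 by exactly the amount needed to keep the linear constraint $\sum_i y_i \alpha_i = 0$ satisfied after alpha_2 has changed.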

In the next post, we will look at how these 2 formulae were derived, and then we will turn to the very last topic in this Support Vector Machines series: Multi-class SVMs :)
