I agree with “agreeing to disagree”

Offensive language detection on social media is one of the prime real-world applications of Natural Language Processing. To perform this task, machine learning models are trained on high volumes of data from selected social media platforms. …

Support Vector Machines — Lecture series — Sequential Minimal Optimization Part 4

In the last few posts, we have been exploring the sequential minimal optimization algorithm. At the start of this algorithm, we have a vector of Lagrange multipliers, and our objective is to repeatedly pick 2 of these Lagrange multipliers from the vector and change…
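Why must SMO change two multipliers at a time rather than one? The equality constraint on the multipliers forces it: any change to a single α would break the constraint, so a second α must absorb the change. A minimal NumPy sketch (with a made-up toy label vector, not data from the series) illustrates this:

```python
import numpy as np

# Toy labels and an initial feasible multiplier vector: sum(alpha * y) == 0.
y = np.array([1.0, -1.0, 1.0, -1.0])
alpha = np.array([0.5, 0.5, 0.2, 0.2])
assert np.isclose(alpha @ y, 0.0)

# Pick a pair (i, j) and move alpha[j] by a step t. To keep
# sum(alpha * y) == 0, alpha[i] must move by -t * y[j] / y[i],
# i.e. the pair moves along a line in the (alpha_i, alpha_j) plane.
i, j, t = 0, 1, 0.1
new_alpha = alpha.copy()
new_alpha[j] += t
new_alpha[i] -= t * y[j] / y[i]

print(np.isclose(new_alpha @ y, 0.0))  # the constraint still holds
```

This is only the feasibility argument; the full SMO update also clips the pair to the box constraints, which the later posts cover.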

Support Vector Machines — Lecture series — Sequential Minimal Optimization Part 3

In the previous post, I spoke about the two main steps involved in implementing the sequential minimal optimization algorithm: first selecting the Lagrange multipliers to optimize, and then optimizing the chosen Lagrange multipliers with an analytical method. In this post, I would be…

Support Vector Machines — Lecture series — Sequential Minimal Optimization Part 2

In the previous post, I gave a general overview of what the Sequential Minimal Optimization algorithm is and how it works. In the subsequent posts, we will break the concepts behind the SMO algorithm down into manageable chunks that you can easily assimilate, understand and enjoy. …

Support Vector Machines — Lecture series — An introduction to the Sequential Minimal Optimization (SMO) Algorithm

In the previous posts, we were introduced to the SVM optimization problem, which is demonstrated in Fig. 1 below:

This optimization problem can be solved by using a convex optimization package such as CVXOPT. However, solving this problem with such a package becomes problematic when we are dealing with large…
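In case the figure does not render in this preview, the optimization problem being referred to is the standard soft-margin SVM dual, which can be stated as:

```latex
\max_{\alpha} \; \sum_{i=1}^{m} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{m} \sum_{j=1}^{m}
    \alpha_i \alpha_j \, y_i y_j \, \langle x_i, x_j \rangle
\quad \text{subject to} \quad
0 \le \alpha_i \le C, \qquad \sum_{i=1}^{m} \alpha_i y_i = 0
```

This is a quadratic program in the m multipliers α, which is why a general-purpose QP solver works for small m but scales poorly as the dataset grows.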

Support Vector Machines — Lecture series — Types of kernels

In the previous post, we had a look at the kernel trick and how it can be applied mathematically to help us find an optimal hyperplane for data points that are not linearly separable. In this post, we shall explore some popular types of kernels.
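As a quick reference for the kernels discussed, here is a minimal NumPy sketch of three popular choices (the default parameter values `c`, `d`, and `gamma` are illustrative, not prescribed by the series):

```python
import numpy as np

def linear_kernel(x, z):
    """K(x, z) = <x, z> — equivalent to using no feature mapping at all."""
    return np.dot(x, z)

def polynomial_kernel(x, z, c=1.0, d=3):
    """K(x, z) = (<x, z> + c)^d — implicit map to all monomials up to degree d."""
    return (np.dot(x, z) + c) ** d

def rbf_kernel(x, z, gamma=0.5):
    """K(x, z) = exp(-gamma * ||x - z||^2) — implicit infinite-dimensional map."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([1.0, 2.0])
z = np.array([0.5, -1.0])
print(linear_kernel(x, z))      # -1.5
print(polynomial_kernel(x, z))  # (-1.5 + 1)^3 = -0.125
print(rbf_kernel(x, z))         # exp(-0.5 * 9.25)
```

Each function returns a scalar similarity between two vectors, which is all an SVM ever needs from the feature space.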

Learning objective:

Understand some…

Support Vector Machines — Lecture series — Kernels part 3 (The Kernel trick)

In this tutorial, we will learn how to apply the kernel trick directly to help us find the optimal hyperplane between data points that belong to different classes and are not linearly separable. …
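The essence of the trick can be checked numerically: for 2-D inputs, the degree-2 polynomial kernel (x·z + 1)² gives exactly the same value as explicitly mapping both points into a 6-dimensional feature space and taking an inner product there. A small sketch (the feature map `phi` below is the standard one for this kernel, written out for illustration):

```python
import numpy as np

def phi(v):
    """Explicit feature map whose inner product equals (x.z + 1)^2 in 2-D."""
    x1, x2 = v
    return np.array([x1**2, x2**2,
                     np.sqrt(2) * x1 * x2,
                     np.sqrt(2) * x1,
                     np.sqrt(2) * x2,
                     1.0])

x = np.array([1.0, 3.0])
z = np.array([2.0, -1.0])

explicit = np.dot(phi(x), phi(z))      # map first, then take the inner product
tricked  = (np.dot(x, z) + 1.0) ** 2   # the kernel trick: never build phi at all

print(np.isclose(explicit, tricked))   # True — same value, far less work
```

The kernel computes the similarity in the high-dimensional space while only ever touching the original low-dimensional vectors.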

Support Vector Machines — Lecture series — Kernels part 2

In the last post, I discussed the problem involved in transforming every point from one dimension to a higher dimension just to be able to find the right hyperplane that can separate the points into their respective classes. This is a computationally expensive task, especially if there…
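To make the cost concrete: the number of monomial features of degree at most d in n variables is C(n + d, d), so the explicit feature space blows up quickly with the input dimension. A quick sketch of the counting argument (the function name is mine, for illustration):

```python
from math import comb

def poly_feature_dim(n, d):
    """Number of monomials of degree <= d in n variables: C(n + d, d)."""
    return comb(n + d, d)

# The explicit degree-3 feature space grows roughly like n^3, while the
# kernel (x.z + c)^3 still costs only one n-dimensional inner product.
for n in (10, 100, 1000):
    print(n, poly_feature_dim(n, 3))
```

For n = 1000 inputs that is already over a hundred million explicit features, which is precisely the expense the kernel trick sidesteps.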

Support Vector Machines — Lecture series — Kernels: An introduction

In the last couple of lecture posts, we have been talking about how to determine hyperplanes that can optimally separate given datasets. In the next couple of posts, we will look at the concept of kernels: what they are and why they are so useful in machine learning.

Learning…

Support Vector Machines — Lecture series — Karush-Kuhn-Tucker conditions part 6

In the last post, we looked at the idea of “complementary slackness” and how it captures the relationship between the variables of the primal problem and the constraints of the dual problem, and vice versa. …
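For reference, in the hard-margin SVM the complementary slackness conditions take the form:

```latex
\alpha_i \left( y_i (\mathbf{w}^{\top} \mathbf{x}_i + b) - 1 \right) = 0
\quad \text{for all } i
```

So whenever α_i > 0, the corresponding constraint must be tight, i.e. y_i(w·x_i + b) = 1, meaning the point lies exactly on the margin (a support vector); and whenever the constraint is slack, α_i must be 0, so that point does not influence the solution at all.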
