Understanding SVM Classifier and Hyperplane
Dec 12, 2024
Lecture Notes: SVM Classifier and Hyperplane Calculation
Introduction
Focus: finding the equation of the maximal-margin hyperplane using the SVM classifier
Previous videos covered SVM classifier algorithm and examples
Given Data Set
3 input vectors, each with 2 features
Input vector 1: (2, 2), Target = -1
Input vector 2: (4, 5), Target = +1
Input vector 3: (7, 4), Target = +1
Objective: Apply SVM algorithm to find the hyperplane
SVM Classifier Key Concepts
Need to find values for the weight vector and bias
Calculate the alpha vector (one variable αᵢ per input vector):
Alpha values: α1, α2, α3
Conditions:
∑(αᵢyᵢ) = 0
αᵢ ≥ 0 for all i
Steps to Calculate Alphas
Calculate Alpha Values
Maximize the dual objective φ(α) = ∑ᵢαᵢ − ½ ∑ᵢ∑ⱼ αᵢαⱼ yᵢyⱼ (xᵢ•xⱼ)
Replace n = 3 (number of vectors)
Expand the equation considering combinations (i, j)
Calculate dot products for combinations, e.g., x₁•x₂
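The pairwise dot products can be checked quickly; a minimal sketch using plain Python lists for the three given input vectors:

```python
# Pairwise dot products x_i . x_j for the given data set
X = [(2, 2), (4, 5), (7, 4)]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Gram matrix of all (i, j) combinations needed to expand phi(alpha)
gram = [[dot(xi, xj) for xj in X] for xi in X]
# gram == [[8, 18, 22], [18, 41, 48], [22, 48, 65]]
```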
Simplify the Equation
Use the constraint ∑(αᵢyᵢ) = 0 with y₁ = −1: α1 = α2 + α3
Replace α1 with α2 + α3, simplify equation
Differentiate φ(α vector) w.r.t α2 and α3, equate to 0
Solve for Alpha Values
α2 = 26/121
α3 = -6/121
α1 = α2 + α3 = 20/121
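After substituting α1 = α2 + α3 and the dot products, φ reduces to 2α2 + 2α3 − ½(13α2² + 32α2α3 + 29α3²); setting the two partial derivatives to zero gives a 2×2 linear system. The coefficients below are worked out by hand from the Gram matrix rather than quoted from the notes; a sketch with exact fractions:

```python
from fractions import Fraction as F

# Linear system from d(phi)/d(a2) = 0 and d(phi)/d(a3) = 0:
#   13*a2 + 16*a3 = 2
#   16*a2 + 29*a3 = 2
# (coefficients derived by hand from the Gram matrix)
det = F(13 * 29 - 16 * 16)        # 121
a2 = (2 * F(29) - 16 * 2) / det   # Cramer's rule: 26/121
a3 = (13 * 2 - 2 * F(16)) / det   # -6/121
a1 = a2 + a3                      # from the constraint: 20/121
```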
Calculate Weight Vector
Formula: Weight vector = ∑(αᵢyᵢxᵢ) for i = 1 to n
Substitute α and y values:
Final weight vector: (2/11, 6/11)
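The weight-vector sum can be verified numerically with the α values derived in the lecture (a minimal sketch using exact fractions):

```python
from fractions import Fraction as F

X = [(2, 2), (4, 5), (7, 4)]
y = [-1, 1, 1]
alpha = [F(20, 121), F(26, 121), F(-6, 121)]  # values from the lecture

# w = sum_i alpha_i * y_i * x_i, computed component by component
w = [sum(a * t * x[k] for a, t, x in zip(alpha, y, X)) for k in range(2)]
# w == [2/11, 6/11]
```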
Calculate Bias
Equation: bias b = ½ [ min over yᵢ = +1 of (W•xᵢ) + max over yᵢ = −1 of (W•xᵢ) ]
Solve for positive and negative class examples
Bias = 27/11
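The bias formula can be evaluated directly on the data (a sketch using the lecture's weight vector; the `pos`/`neg` names are my own):

```python
from fractions import Fraction as F

X = [(2, 2), (4, 5), (7, 4)]
y = [-1, 1, 1]
w = [F(2, 11), F(6, 11)]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

pos = min(dot(w, x) for x, t in zip(X, y) if t == +1)  # min over the +1 class: 38/11
neg = max(dot(w, x) for x, t in zip(X, y) if t == -1)  # max over the -1 class: 16/11
b = (pos + neg) / 2
# b == 27/11
```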
SVM Classifier Equation
f(x vector) = Weight vector • x vector - Bias
Maximal margin hyperplane: set f(x vector) = 0
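Plugging the three training points into f(x) = W•x − bias confirms the classifier reproduces the targets (a quick check with the lecture's values for the weight vector and bias):

```python
from fractions import Fraction as F

X = [(2, 2), (4, 5), (7, 4)]
y = [-1, 1, 1]
w = [F(2, 11), F(6, 11)]
b = F(27, 11)

def f(x):
    # Decision function: f(x) = w . x - b
    return sum(wi * xi for wi, xi in zip(w, x)) - b

scores = [f(x) for x in X]  # [-1, 1, 1]
# The sign of each score matches its target label
assert all((s > 0) == (t > 0) for s, t in zip(scores, y))
```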
Support Vectors and Hyperplane
Positive alpha values indicate support vectors; α3 is negative (it would be clipped to 0)
Only x1 and x2 are support vectors
Hyperplane passes through the midpoint of support vectors
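The midpoint claim is easy to verify: the midpoint of x1 = (2, 2) and x2 = (4, 5) is (3, 7/2), and it satisfies the hyperplane equation exactly:

```python
from fractions import Fraction as F

w = (F(2, 11), F(6, 11))
b = F(27, 11)
mid = (F(2 + 4, 2), F(2 + 5, 2))  # midpoint of the two support vectors: (3, 7/2)

# f(mid) should be 0 if the hyperplane passes through the midpoint
value = w[0] * mid[0] + w[1] * mid[1] - b
# value == 0
```

Equivalently, clearing the denominators gives the hyperplane in the form 2x + 6y = 27.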
Conclusion
Successfully applied SVM classifier to find hyperplane with maximum margin
Hyperplane passes through the midpoint of the support vectors and is perpendicular to the weight vector
Call to Action
Like, share, and subscribe for more content on SVM and similar topics.