SVM(support vector Machine)withExplanation.pdf
Support Vector Machine
• Support Vector Machine (SVM) is one of the
most popular supervised learning algorithms.
• It is used for both Classification and
Regression problems.
• However, it is primarily used for Classification
problems in Machine Learning.
[Diagram: SVM — prediction, understanding SVM, confidence-building strategies]
KEY TERMS
• Hyperplane:
• SVM identifies the optimal hyperplane that maximizes the
margin between classes in the feature space.
• Especially useful in scenarios where linear separation is not
feasible.
• Support Vectors:
• Support vectors are the data points closest to the hyperplane.
• They play a pivotal role in determining the position and
orientation of the optimal hyperplane.
• Margin:
• The margin is the distance from the decision surface to the
closest data point on either side.
• If D1 is the distance to the nearest positive point and D2 the
distance to the nearest negative point, the margin is M = D1 + D2.
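The margin definition above can be sketched numerically. For a hard-margin SVM with decision surface w·x + b = 0 and support vectors on w·x + b = ±1, D1 and D2 are each 1/||w||, so M = D1 + D2 = 2/||w||. The weight vector below is a hypothetical value chosen only for illustration:

```python
import math

def margin_width(w):
    """Width of a hard-margin SVM's margin, 2 / ||w||.

    With support vectors lying on w.x + b = +1 and w.x + b = -1,
    each class is 1 / ||w|| from the decision surface, so the full
    margin is D1 + D2 = 2 / ||w||.
    """
    norm = math.sqrt(sum(wi * wi for wi in w))
    return 2.0 / norm

# Hypothetical weight vector, for illustration only: ||w|| = 5.
print(margin_width([3.0, 4.0]))  # 2 / 5 = 0.4
```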
GOAL
The goal of the SVM algorithm is to create the best line or
decision boundary that can segregate n-dimensional space into
classes, so that new data points can easily be placed in the
correct category in the future. This best decision boundary is
called a hyperplane.
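A minimal sketch of learning such a boundary, assuming scikit-learn is available; the one-dimensional toy data below is illustrative, not from the slides:

```python
from sklearn.svm import SVC

# Toy 1-D data: class 0 on the left, class 1 on the right.
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0, 0, 1, 1]

# A linear kernel learns the maximum-margin separating hyperplane
# (here just a threshold between the two classes).
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# New points are assigned the class of the side they fall on.
preds = clf.predict([[0.5], [2.5]])
print(list(preds))  # [0, 1]
```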
Types of SVM
Linear SVM: Linear SVM
is used for linearly separable
data. If a dataset can be
classified into two classes by
a single straight line, the
data is termed linearly
separable, and the classifier
used is called a Linear SVM
classifier.
Non-linear SVM
Non-linear SVM is used
for non-linearly
separable data. If a
dataset cannot be
classified by using a
straight line, the data is
termed non-linear, and
the classifier used is
called a Non-linear
SVM classifier.
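The classic non-linear case is XOR-shaped data, where no single straight line separates the classes. A brief sketch with scikit-learn (assumed available); the `gamma` and `C` values are illustrative choices, not prescribed by the slides:

```python
from sklearn.svm import SVC

# XOR data: no straight line separates class 0 from class 1.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# An RBF kernel implicitly maps the points into a space where
# they become separable.
clf = SVC(kernel="rbf", gamma=1.0, C=100.0)
clf.fit(X, y)

preds = clf.predict(X)
print(list(preds))  # [0, 1, 1, 0] — all four training points correct
```

A linear kernel on the same data could do no better than chance, which is exactly the scenario where a non-linear SVM is needed.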
Kernel Trick
• SVM employs the kernel trick to handle non-linear
relationships in data by mapping it into a higher-
dimensional space.
• This enables SVM to capture complex patterns and
make accurate predictions.
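The key idea is that a kernel computes the dot product in the higher-dimensional space without ever constructing it explicitly. A self-contained sketch for the degree-2 polynomial kernel K(x, z) = (x·z)², whose explicit feature map for 2-D inputs is φ(x) = (x₁², √2·x₁x₂, x₂²); the sample points are illustrative:

```python
import math

def phi(x):
    """Explicit degree-2 polynomial feature map for a 2-D point."""
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def poly_kernel(x, z):
    """K(x, z) = (x . z)^2, computed without leaving the input space."""
    return (x[0] * z[0] + x[1] * z[1]) ** 2

x, z = (1.0, 2.0), (3.0, 4.0)
explicit = sum(a * b for a, b in zip(phi(x), phi(z)))  # dot product in 3-D
implicit = poly_kernel(x, z)                           # kernel in 2-D
print(explicit, implicit)  # both equal 121.0
```

The two values agree, which is why SVM can work in very high-dimensional (even infinite-dimensional, for RBF) feature spaces at the cost of a cheap kernel evaluation.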
Drawback
• Not suitable for large datasets
• Long training time
• More features, more complexity
• Poor performance on noisy data
• Sensitive to the choice of kernel
• Computationally expensive
