KALINGA INSTITUTE OF INDUSTRIAL TECHNOLOGY
Deemed to be University
BHUBANESWAR-751024
School of Computer Engineering
Coordinator: Mr. Sohail Khan (sohail.khanfcs@kiit.ac.in)
Co-coordinator: Dr. Partha Pratim Sarangi
(pp.sarangifcs@kiit.ac.in)
Course: CS 3035 (Machine Learning)
Credits: 3
Session: January to April
Course Objectives
1. To provide a broad survey of different machine-learning
approaches and techniques
2. To understand the principles and concepts of machine
learning
3. To understand neural network concepts
4. To learn regression and reinforcement learning
5. To develop programming skills that help to build
real-world applications based on machine learning
Course Outcomes
Upon completion of the course, the students will be able to:
CO1: Solve typical machine learning problems.
CO2: Compare and contrast different data representations to
facilitate learning.
CO3: Apply the concepts of regression, classification, and
clustering methods.
CO4: Suggest supervised/unsupervised machine learning
approaches for any given application.
CO5: Implement algorithms using machine learning tools.
CO6: Design and implement various machine learning algorithms
in a range of real-world applications.
Lesson Plan
Total Lectures ≈ 42
Before mid-sem ≈ 21
After mid-sem ≈ 21
Classes carried over after the mid-semester: Lectures 1-3
Module 1
Lecture Topics
1 Introduction to Machine Learning, definition, and
real-world applications.
2 Types of machine learning - Supervised, Unsupervised, and
Reinforcement learning; definitions and examples.
3 Regression - Linear Regression, Intuition, Cost
Function
4 Linear Regression - Gradient Descent, Multiple
Linear Regression (an illustrative sketch follows this
module's topics)
5 Closed-form Solution (Normal Equation), Types of Gradient
Descent (Batch, Stochastic, Mini-batch) - definitions and
properties.
6 Normalization and Standardization (definitions and
motivation), Overfitting and Underfitting
7 Bias, Variance, and the Bias-Variance tradeoff
8 Regularization - Lasso Regularization, Ridge
Regularization
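Illustrative sketch for Module 1 (not part of the prescribed
material): a minimal NumPy implementation of batch gradient
descent for simple linear regression, with the closed-form
(normal-equation) fit as a cross-check. The toy data, learning
rate, and iteration count are arbitrary illustrative choices.

    import numpy as np

    # Toy data: y ≈ 4 + 3x plus noise (illustrative only)
    rng = np.random.default_rng(0)
    X = 2 * rng.random((100, 1))
    y = 4 + 3 * X[:, 0] + rng.normal(0, 0.5, 100)

    # Add a bias column so theta = [intercept, slope]
    Xb = np.c_[np.ones(len(X)), X]
    theta = np.zeros(2)
    learning_rate, n_iters = 0.1, 1000

    # Batch gradient descent on the mean-squared-error cost
    for _ in range(n_iters):
        gradients = (2 / len(Xb)) * Xb.T @ (Xb @ theta - y)
        theta -= learning_rate * gradients

    # Closed-form solution (normal equation) for comparison
    theta_closed = np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y
    print("gradient descent:", theta)
    print("closed form     :", theta_closed)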
Module 2
Lecture Topics
9 Classification, Logistic Regression - 1 (binary)
10 Logistic Regression - 2 (binary)
11 Nearest Neighbor and K-Nearest Neighbors (KNN)
12 Error Analysis - Train/Test Split, Validation Set,
Accuracy, Precision, Recall, F-measure, ROC Curve,
Confusion Matrix (an illustrative sketch follows this
module's topics)
13 Naive Bayes Classifier
14 Decision Tree Introduction, ID3 Algorithm - 1
15 Decision Tree - ID3 Algorithm - 2
16 Decision Tree - Problem of Overfitting,
Pre-pruning/Post-pruning of Decision Trees, Examples.
17 Random Forest, Ensemble Learning (bagging, boosting)
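Illustrative sketch for Module 2 (not part of the prescribed
material): a binary classifier trained and evaluated with
scikit-learn, covering the train/test split and the metrics
listed for Lecture 12. The dataset and hyperparameters are
arbitrary illustrative choices.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import (accuracy_score, precision_score,
                                 recall_score, f1_score,
                                 confusion_matrix)

    # Built-in binary classification dataset (illustrative only)
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    clf = LogisticRegression(max_iter=5000)
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)

    print("accuracy :", accuracy_score(y_test, y_pred))
    print("precision:", precision_score(y_test, y_pred))
    print("recall   :", recall_score(y_test, y_pred))
    print("F1       :", f1_score(y_test, y_pred))
    print("confusion matrix:\n", confusion_matrix(y_test, y_pred))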
Mid Semester
1 Support Vector Machine - 1
2 Support Vector Machine - 2
3 Principal Component Analysis - Steps, merits,
demerits (see the illustrative sketch below).
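Illustrative sketch for Lectures 1-3 after the mid-semester
(not part of the prescribed material): a scikit-learn pipeline
that standardizes the data, reduces dimensionality with PCA,
and fits an RBF-kernel SVM. The dataset and the number of
components are arbitrary illustrative choices.

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Standardize, project onto the top principal components,
    # then fit an RBF-kernel support vector classifier
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=20),
                          SVC(kernel="rbf"))
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))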
Module 3
Lecture Topics
4 Introduction to Clustering, K-means Clustering - 1
(an illustrative sketch follows this module's topics)
5 Hierarchical Clustering - Agglomerative
Clustering, Single/Complete/Average/Centroid
Linkage
6 Hierarchical Clustering - Divisive hierarchical
clustering
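Illustrative sketch for Module 3 (not part of the prescribed
material): K-means and average-linkage agglomerative clustering
on synthetic blob data using scikit-learn; the number of
clusters and the data parameters are arbitrary illustrative
choices.

    from sklearn.datasets import make_blobs
    from sklearn.cluster import KMeans, AgglomerativeClustering

    # Synthetic data with three well-separated groups
    X, _ = make_blobs(n_samples=300, centers=3,
                      cluster_std=0.8, random_state=1)

    # K-means clustering
    km = KMeans(n_clusters=3, n_init=10, random_state=1)
    km_labels = km.fit_predict(X)
    print("cluster centres:\n", km.cluster_centers_)
    print("inertia (within-cluster sum of squares):", km.inertia_)

    # Agglomerative (bottom-up) clustering with average linkage
    agg = AgglomerativeClustering(n_clusters=3, linkage="average")
    agg_labels = agg.fit_predict(X)
    print("agglomerative labels:", agg_labels[:10])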
Module 4
Lecture Topics
7 to 12 Introduction to Neural Networks, McCulloch-Pitts Neuron
Perceptron Model
Feed Forward Neural Networks
Multilayer Networks and Hidden layer representation
Non-linear problem solving, Activation Functions
Backpropagation Algorithm - 1
Backpropagation Algorithm - 2 (an illustrative sketch
follows this module's topics)
Exploding and Vanishing Gradient Problems - why they occur
and how to avoid them
Residual Neural Networks (how they mitigate the vanishing
gradient problem)
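Illustrative sketch for Module 4 (not part of the prescribed
material): a two-layer feed-forward network with sigmoid
activations trained by hand-coded backpropagation on the XOR
problem, a classic non-linearly separable task. Layer sizes,
learning rate, and iteration count are arbitrary illustrative
choices.

    import numpy as np

    # XOR truth table: not linearly separable, so a hidden layer is needed
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros((1, 4))   # input -> hidden
    W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros((1, 1))   # hidden -> output
    lr = 0.5

    for _ in range(20000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backward pass: squared-error loss, sigmoid'(s) = s * (1 - s)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent updates
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    print(np.round(out, 2))   # typically approaches [[0], [1], [1], [0]]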
Module 5
Topics
Awareness of machine learning tools such as scikit-learn,
PyTorch, TensorFlow, and Kaggle competitions - given as an
assignment; no questions will be asked in the end-semester
examination. A brief PyTorch sketch follows.
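As a starting point for the tools assignment, the following is
a minimal PyTorch sketch (not part of the examinable material)
showing the basic workflow of data, model, loss, and optimizer
on random data; all sizes and hyperparameters are arbitrary
illustrative choices.

    import torch
    from torch import nn

    # Random data purely to illustrate the training loop
    X = torch.randn(64, 10)
    y = torch.randint(0, 2, (64, 1)).float()

    model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(),
                          nn.Linear(16, 1), nn.Sigmoid())
    loss_fn = nn.BCELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for epoch in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()       # autograd performs backpropagation
        optimizer.step()

    print("final training loss:", loss.item())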
Activities
Task                   Marks
Before Mid-semester
  Assignment           5
  Quiz                 5
  Coding Assignment    5
After Mid-semester
  Assignment           5
  Quiz                 5
  Coding Assignment    5
Textbooks:
1. Kevin P. Murphy, “Probabilistic Machine Learning”, The MIT
Press, 2023.
2. Ethem Alpaydin, “Introduction to Machine Learning”, Fourth
Edition, The MIT Press, 2020.
Reference Books:
1. Laurene Fausett, “Fundamentals of Neural Networks:
Architectures, Algorithms, and Applications”,
Pearson Education, 2008.
2. C. M. Bishop, “Pattern Recognition and Machine Learning”,
Springer, 2007.
3. Simon Haykin, “Neural Networks and Learning Machines”,
Pearson, 2008.