Second Year CORE COURSE VIII Semester III
MACHINE LEARNING TECHNIQUES
Code: (Theory) Credit: 5
COURSE OBJECTIVES:
To learn about machine intelligence and machine learning applications
To understand the theoretical and practical aspects of Probabilistic Graphical Models
To understand how to evaluate learning algorithms and perform model selection
UNIT – I INTRODUCTION:
Machine Learning - Machine Learning Foundations - Overview - Design of a Learning
System - Types of Machine Learning - Applications. Mathematical Foundations of Machine
Learning - Random Variables and Probabilities - Probability Theory - Probability Distributions
- Decision Theory - Bayes Decision Theory - Information Theory
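The Bayes decision rule listed above can be sketched numerically; the spam-filter setting and all probabilities below are invented purely for illustration:

```python
# Minimal illustration of Bayes' rule for a two-class decision problem.
# The priors and class-conditional likelihoods are made-up numbers.

def posterior(prior, likelihood, evidence):
    """P(class | x) = P(x | class) * P(class) / P(x)."""
    return likelihood * prior / evidence

# Hypothetical numbers: P(spam)=0.3, P(word|spam)=0.8, P(word|ham)=0.1
p_spam, p_ham = 0.3, 0.7
p_word_given_spam, p_word_given_ham = 0.8, 0.1

# Total probability of observing the word (the evidence term).
p_word = p_word_given_spam * p_spam + p_word_given_ham * p_ham

p_spam_given_word = posterior(p_spam, p_word_given_spam, p_word)
# Bayes decision rule: pick the class with the larger posterior.
decision = "spam" if p_spam_given_word > 0.5 else "ham"
print(round(p_spam_given_word, 3), decision)
```

The decision rule simply compares posteriors; minimizing expected loss instead of error rate generalizes this comparison.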
UNIT – II SUPERVISED LEARNING:
Linear Models for Regression - Linear Models for Classification - Naïve Bayes - Discriminant
Functions - Probabilistic Generative Models - Probabilistic Discriminative Models - Bayesian
Logistic Regression. Decision Trees - Classification Trees - Regression Trees - Pruning. Neural
Networks - Feed-forward Network Functions - Back-propagation. Support Vector Machines -
Ensemble Methods - Bagging - Boosting
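As a sketch of the linear regression topic above, a closed-form least-squares fit for one feature; the toy data points are invented:

```python
# Ordinary least-squares fit of y = w*x + b for one feature, closed form.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = sample covariance of (x, y) / sample variance of x.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    b = mean_y - w * mean_x
    return w, b

xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]          # data generated exactly by y = 2x + 1
w, b = fit_line(xs, ys)
print(w, b)                 # -> 2.0 1.0
```

The same least-squares objective, extended to many features, is what the linear-models material in this unit develops in matrix form.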
UNIT – III UNSUPERVISED LEARNING:
Clustering - K-means - EM Algorithm - Mixtures of Gaussians. The Curse of Dimensionality -
Dimensionality Reduction - Factor Analysis - Principal Component Analysis - Probabilistic PCA -
Independent Component Analysis
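The K-means algorithm named above can be sketched with a bare-bones assignment/update loop; the 1-D dataset and the choice of k = 2 are illustrative:

```python
# A bare-bones K-means loop on 1-D points (k = number of initial centres).

def kmeans_1d(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centre.
        clusters = {c: [] for c in range(len(centers))}
        for p in points:
            nearest = min(range(len(centers)), key=lambda c: abs(p - centers[c]))
            clusters[nearest].append(p)
        # Update step: move each centre to the mean of its cluster.
        centers = [sum(cl) / len(cl) if cl else centers[i]
                   for i, cl in clusters.items()]
    return centers

points = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
print(sorted(kmeans_1d(points, [0.0, 5.0])))   # two centres, near 1.0 and 9.0
```

Replacing the hard assignment with soft responsibilities turns this loop into the EM algorithm for a mixture of Gaussians, which is exactly the connection this unit draws.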
UNIT – IV PROBABILISTIC GRAPHICAL MODELS:
Graphical Models - Undirected Graphical Models - Markov Random Fields - Directed
Graphical Models - Bayesian Networks - Conditional Independence Properties - Inference -
Learning - Generalization - Hidden Markov Models - Conditional Random Fields (CRFs)
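Inference in the Hidden Markov Models listed above can be sketched with the forward algorithm; the two-state model and all probabilities below are invented for illustration:

```python
# Forward algorithm for a tiny 2-state HMM; every probability is made up.
# States: 0 = Rainy, 1 = Sunny; observations: 0 = walk, 1 = shop.

states = [0, 1]
start = [0.6, 0.4]                    # initial state distribution
trans = [[0.7, 0.3],                  # trans[i][j] = P(next state j | state i)
         [0.4, 0.6]]
emit = [[0.1, 0.9],                   # emit[i][o] = P(observation o | state i)
        [0.6, 0.4]]

def forward(obs):
    """Return P(observation sequence), summing over all hidden state paths."""
    alpha = [start[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in states) * emit[j][o]
                 for j in states]
    return sum(alpha)

print(round(forward([0, 1]), 4))      # likelihood of observing walk then shop
```

The dynamic-programming recursion over alpha is what makes exact inference tractable here, in contrast to summing over all 2^T state paths directly.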
UNIT – V ADVANCED LEARNING:
Sampling - Basic Sampling Methods - Monte Carlo. Reinforcement Learning - K-Armed
Bandit - Elements - Model-Based Learning - Value Iteration - Policy Iteration. Temporal
Difference Learning - Exploration Strategies - Deterministic and Non-deterministic Rewards
and Actions. Computational Learning Theory - Mistake Bound Analysis, Sample Complexity
Analysis, VC Dimension - Occam Learning, Accuracy and Confidence Boosting
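The value iteration step above can be sketched on a toy problem; the three-state chain, its rewards, and the discount factor are arbitrary choices for illustration:

```python
# Value iteration on a tiny deterministic 3-state chain (state 2 is the goal).
# Rewards and the discount factor gamma are invented for the sketch.

n_states = 3
gamma = 0.9

def step(s, a):
    """Deterministic transition: action 0 = stay, 1 = move right."""
    if s == 2:
        return 2, 0.0                       # absorbing goal state
    if a == 1:
        return s + 1, 1.0 if s + 1 == 2 else 0.0
    return s, 0.0

V = [0.0] * n_states
for _ in range(50):                          # repeated Bellman backups
    V = [max(r + gamma * V[s2]
             for a in (0, 1)
             for s2, r in [step(s, a)])
         for s in range(n_states)]
print([round(v, 2) for v in V])              # optimal state values
```

Each sweep applies the Bellman optimality backup V(s) = max_a [r + gamma * V(s')]; with deterministic transitions this converges after only a few sweeps.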
UNIT – VI CURRENT CONTOURS (For continuous internal assessment only):
Contemporary Developments Related to the Course during the Semester Concerned
REFERENCES:
1. Christopher Bishop, “Pattern Recognition and Machine Learning” Springer, 2007.
2. Kevin P. Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press, 2012.
3. Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, Third Edition, 2014.
4. Tom Mitchell, “Machine Learning”, McGraw-Hill, 1997.
5. Trevor Hastie, Robert Tibshirani, Jerome Friedman, "The Elements of
Statistical Learning", Springer, Second Edition, 2011.
6. Stephen Marsland, “Machine Learning - An Algorithmic Perspective”,
Chapman and Hall/CRC Press, Second Edition, 2014.
7. Suresh Samudrala, “Demystifying Machine Learning, Neural Networks and Deep
Learning”, Notion Press, 2019.
8. Rajiv Chopra, “Machine Learning”, Khanna Publishing House, 2020.
9. https://data-flair.training/blogs/machine-learning-tutorial/
10. https://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html
11. https://www.geeksforgeeks.org/machine-learning/
COURSE OUTCOMES:
At the end of the course, the students will be able to:
Have a good understanding of the fundamental issues and challenges of machine learning.
Have an understanding of the strengths and weaknesses of many popular machine learning
approaches.
Be able to design and implement various machine learning algorithms in a range of real-
world applications.
Use a tool to implement typical clustering algorithms for different types of applications
Design and implement an HMM for a sequence model type of application
*****