Birla Institute of Technology & Science, Pilani
Work Integrated Learning Programmes Division
Digital Learning Handout
Part A: Content Design
Course Title: Machine Learning for Electronics Engineers
Course No(s): VLM ZG513
Credit Units: 4
Content Author: Prof. Meetha Shenoy
Instructor-In-Charge: Nilanjan Mukherjee
Course description: Machine Learning approaches - supervised, unsupervised, semi-supervised, and
reinforcement learning. Multi-Layer Perceptrons, Convolutional Neural Networks, Recurrent Neural
Networks, Generative Deep Learning, deep learning techniques and their application to various types of
electronic systems/subsystems such as control-dominated systems, NLP systems, vision-based systems,
communication systems, embedded systems, and IoT systems. Multi-modal and multi-task learning,
transfer learning, challenges in the implementation of ML techniques, complexity analysis of ML
architectures for hardware implementation, efficient architectures/topologies for ML implementation,
hardware platforms, tools, and software packages for ML.
Course Objectives:
CO1 Understand and differentiate between various machine learning paradigms including
supervised, unsupervised, semi-supervised, and reinforcement learning, and their theoretical
foundations.
CO2 Explore deep learning architectures such as Multi-Layer Perceptrons (MLPs), Convolutional
Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Generative Models, and
evaluate their applicability to real-world problems.
CO3 Analyze the integration of machine learning techniques in diverse electronic systems and
subsystems including control systems, vision systems, NLP systems, communication
networks, embedded platforms, and IoT devices.
CO4 Apply advanced learning paradigms such as multi-modal learning, multi-task learning, and
transfer learning, and address implementation challenges including computational complexity
and system constraints.
CO5 Design and evaluate hardware-efficient ML architectures, considering complexity analysis,
topology optimization, and suitability for deployment on hardware platforms using appropriate
tools and software packages.
Learning Outcomes
LO1 Classify and compare various machine learning approaches—supervised, unsupervised, semi-
supervised, and reinforcement learning—based on their underlying algorithms and use cases.
LO2 Design and implement deep learning models such as MLPs, CNNs, RNNs, and generative
networks for specific tasks in NLP, vision, and control systems.
LO3 Analyze the role of ML techniques in electronic and embedded systems, including IoT and
communication subsystems, and evaluate their performance metrics.
LO4 Apply advanced ML strategies like transfer learning, multi-modal learning, and multi-task
learning to improve model generalization and task efficiency in resource-constrained
environments.
LO5 Evaluate and optimize ML architectures for hardware implementation by performing
complexity analysis and selecting suitable tools, platforms (e.g., FPGA, GPU), and software
frameworks.
Text Book(s)
T1 I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, MIT Press, 2016.
T2 C. M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
Reference Book(s)
R1 S. Shalev-Shwartz, S. Ben-David, Understanding Machine Learning: From Theory to
Algorithms, CUP, 2014.
R2 M. Mohri, A. Rostamizadeh, A. Talwalkar, Foundations of Machine Learning, 2nd Edition,
MIT Press, 2018.
R3 G. Rebala, A. Ravi, S. Churiwala, An Introduction to Machine Learning, Springer, 2019.
R4 C. C. Aggarwal, Neural Networks and Deep Learning, Springer, 2018.
R5 U. Kamath, J. Liu, J. Whitaker, Deep Learning for NLP and Speech Recognition, Springer,
2019.
R6 S. Khan, H. Rahmani, S. A. A. Shah, M. Bennamoun, A Guide to Convolutional Neural
Networks for Computer Vision, Morgan & Claypool, 2018.
R7 J. Krohn, Deep Learning Illustrated, Pearson/AW, 2020.
R8 L. Deng and D. Yu, Deep Learning: Methods and Applications, Microsoft Technical Report
Series, 2014.
R9 Research papers published in journals and conference proceedings.
R10 M. P. Deisenroth, A. A. Faisal, C. S. Ong, Mathematics for Machine Learning, Cambridge
University Press, 2020.
Part B: Learning Plan
Contact Hours / Topics / Sub-Topics / References

Module 1: Introduction to Machine Learning (Sessions 1-3; References: T1, R1, R2, R3, R4)
1. Introduction to the course, Motivation, ML Categories
2. Supervised Learning: Regression, Classification; Unsupervised Learning: K-means, EM
3. Data types, Data modeling, Feature selection, Transformation, Dimensionality reduction

Module 2: Basics of Neural Networks (Sessions 4-5; References: T1, R1, R2, R3, R4)
4-5. Basics: Universal Approximation Theorem, Activation Functions; Supervised Training (Input
Space, Samples, Normalization/Denormalization, Forward Propagation, Loss Functions, Error
Computation, Backpropagation, Parameter Update, Generalization); Model Training (Early
Stopping, Regularization, Hyperparameter Selection, Evaluation Metrics, Model Performance).
Application: Neural Network based Adaptive PID Controller

Module 3: Convolutional Neural Networks (Sessions 6-8; References: T1, T2, published papers)
6. Introduction and Building Blocks of CNN; Convolution and Correlation, Convolution Types,
Activation Function Types; Pooling Layer Types and Subsampling; Local Connectivity (Sparse
Interactions), Parameter Sharing, Spatial Arrangement; Classical CNN Architectures
7. CNN Learning and Training: Weight Initialization, Regularization of CNN, Gradient
Computation and Gradient-based Learning, Optimizers, Reusing Pretrained Layers.
Application: Object Detection
8. Review Session 1

Module 4: Recurrent Neural Networks (Sessions 9-10; References: T1, T2, R5, R6, R7, R8,
published papers)
9. Sequence Data and Modelling, Discrete-time Markov Chains, Hidden Markov Models
(Generative Approach), Conditional Random Fields (Discriminative Approach); Introduction
and Building Blocks of RNN, Recurrence and Memory, Forward and Backward Propagation in
RNNs, Vanishing Gradient Problem and Regularization
10. RNN Architectures: Classical Deep RNN, Bidirectional RNN, Residual LSTM, Recurrent
Highway Networks, SRU and Quasi-RNN, Recursive Neural Network

Module 5: Generative Deep Learning (Session 11; References: T1, R1, R2, R3, R4)
11. Energy-Based Models (Introduction), Boltzmann Machines and Restricted Boltzmann
Machines, Deep Belief Networks, Autoencoders, Variational Autoencoders, Sparse Coding
(Introduction), Generative Adversarial Networks (GANs)

Module 6: ML Hardware Architectures (Sessions 12-13; References: R5, R6, R7, R8, R9,
published papers)
12-13. Choosing a hardware architecture for Machine/Deep Learning: FPGA, GPU,
Microcontrollers, MPSoC; Embedded System Considerations; Complexity Analysis of DNNs;
Performance Tradeoffs: Memory, Latency, Throughput, Power Consumption; Efficient
Topologies; Quantization and Accuracy Tradeoffs; Pruning (Memory- and Energy-aware)

Module 7: Advanced Topics (Sessions 14-15; References: T2, R8, R9)
14. Transfer Learning: Introduction, Applications; Deep Reinforcement Learning: Introduction,
Applications
15. Multi-Modal and Multi-Task Learning: Handling Multiple Modalities (e.g., Text and Image,
Text and Speech, Audio and Video); Multi-Task Learning (e.g., Speech, NLP, or Image
Domains); Current Status and Future Trends

16. Review Session 2
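For orientation, the supervised-training workflow covered in Sessions 4-5 (forward propagation, loss computation, backpropagation, parameter update, early stopping) can be sketched in a few lines of NumPy. The toy sine-regression data, network size, and hyperparameters below are illustrative choices, not part of the syllabus:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative only): y = sin(x) with a little noise.
X = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(X) + rng.normal(0, 0.05, size=(256, 1))
X_tr, y_tr, X_va, y_va = X[:200], y[:200], X[200:], y[200:]

# One hidden tanh layer; the shapes are arbitrary choices for this sketch.
W1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # activations and prediction

lr, best_val, patience, wait = 0.05, np.inf, 100, 0
for epoch in range(5000):
    h, y_hat = forward(X_tr)
    # Backpropagation of the mean-squared-error loss through both layers.
    g_out = 2 * (y_hat - y_tr) / len(X_tr)
    gW2, gb2 = h.T @ g_out, g_out.sum(0)
    g_h = (g_out @ W2.T) * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    gW1, gb1 = X_tr.T @ g_h, g_h.sum(0)
    # Parameter update (plain full-batch gradient descent).
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    # Early stopping: halt when the validation loss stops improving.
    val = np.mean((forward(X_va)[1] - y_va) ** 2)
    if val < best_val - 1e-6:
        best_val, wait = val, 0
    else:
        wait += 1
        if wait >= patience:
            break

print(f"stopped at epoch {epoch}, validation MSE {best_val:.4f}")
```

The same loop is what frameworks such as PyTorch or TensorFlow automate via automatic differentiation; writing it out once by hand clarifies what "parameter update" and "early stopping" mean in the session topics above.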
Experiential Learning Components:
Suggested applications to be discussed for each module (may be covered in Lecture/Lab or in both):
Modules 1 and 2: Neural Network based Adaptive PID Controller
Module 3: Image Object Detection
Module 4:
a. Application based on CNN/RNN: e.g., Channel Prediction, Sensor Fusion (any application
domain, including IoT data handling)
b. Application covering correlation between data samples (any application domain): e.g., Word
Classification, Sentence Classification, Sentiment Polarity, Video Analysis, IoT/Embedded
Systems for applications such as energy management, smart city monitoring, etc.
c. Hybrid system from any application domain: e.g., a DNN-HMM model for speech recognition
d. Deep learning for PHY-layer design, Resource Allocation, Adaptive Modulation for
wired/wireless systems, Video Analysis, VLSI circuit/system design (the given applications are
for illustration purposes; they may vary and may be covered in Lecture/Lab or in both)
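As a pointer to the quantization and accuracy-tradeoff topics of Module 6 (ML Hardware Architectures), the following is a minimal sketch of symmetric post-training int8 quantization; a random weight tensor stands in for a trained layer, and the per-tensor scheme shown is only one of several used in practice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative weight tensor; in practice this comes from a trained layer.
w = rng.normal(0, 0.2, size=(64, 64)).astype(np.float32)

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w is approximated as scale * q."""
    scale = np.abs(w).max() / 127.0               # one step of the int8 grid
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

q, scale = quantize_int8(w)
w_hat = q.astype(np.float32) * scale              # dequantized approximation

# Memory shrinks 4x (float32 -> int8); the accuracy cost is the rounding error,
# bounded by half a quantization step per weight.
mem_ratio = w.nbytes / q.nbytes
err = np.abs(w - w_hat).max()
print(f"compression {mem_ratio:.0f}x, max abs error {err:.5f}, step {scale:.5f}")
```

The 4x memory reduction versus bounded rounding error is the simplest instance of the memory/accuracy tradeoff the module analyzes; pruning and efficient topologies trade along the same axes.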
Evaluation Scheme
Legend: EC = Evaluation Component; AN = Afternoon Session; FN = Forenoon Session

No    | Name                      | Type        | Duration | Weight | Day, Date, Session, Time
EC-1* | Quiz                      | Online      | 1 week   | 10%    | September 01-10, 2025
EC-1* | Assignment/Lab Assignment | Online      | 10 days  | 20%    | November 01-10, 2025
EC-2  | Mid-Semester Test         | Closed Book | 2 hours  | 30%    | 20/09/2025 (AN)
EC-3  | Comprehensive Exam        | Open Book   | 2½ hours | 40%    | 29/11/2025 (AN)

* EC-1 (20%-30% overall): Quiz (optional): 5-10%; Lab Assignment/Assignment: 20%-30%
Syllabus for Mid-Semester Test (Closed Book): Topics in Session Nos. 1 to 8
Syllabus for Comprehensive Exam (Open Book): All topics (Session Nos. 1 to 16)

Important links and information:
Elearn portal: https://elearn.bits-pilani.ac.in
Students are expected to visit the Elearn portal on a regular basis and stay up to date with the latest
announcements and deadlines.
Contact sessions: Students should attend the online lectures as per the schedule provided on the
Elearn portal.
Evaluation Guidelines:
1. EC-1 consists of either two Assignments or three Quizzes. Students will attempt them
through the course pages on the Elearn portal. Announcements will be made on the portal
in a timely manner.
2. For Closed Book tests: No books or reference material of any kind will be permitted.
3. For Open Book exams: Use of books and any printed/written reference material (filed or
bound) is permitted; however, loose sheets of paper are not allowed. Use of calculators
is permitted in all exams. Laptops and mobile phones of any kind are not allowed.
Exchange of any material is not allowed.
4. If a student is unable to appear for the Regular Test/Exam due to genuine exigencies, the
student should follow the procedure to apply for the Make-Up Test/Exam which will be
made available on the Elearn portal. The Make-Up Test/Exam will be conducted only at
selected exam centres on the dates to be announced later.
It shall be the responsibility of the individual student to be regular in maintaining the self-study
schedule as given in the course handout, attend the online lectures, and take all the prescribed
evaluation components such as Assignment/Quiz, Mid-Semester Test, and Comprehensive Exam
according to the evaluation scheme provided in the handout.
*********