PyTorch
Agenda
• Introduction
• Basics of PyTorch
o Relevant Packages
o Computation graph
o Tensors and their operations
• Deep learning computations
o Defining a network
o Training a network
o Auto Differentiation
• Experiment walk through
Build a Neural Network from scratch in Python
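The slide's code is not reproduced here; below is a minimal sketch (with assumed layer sizes and toy data) of what a from-scratch, NumPy-only implementation of a two-layer network looks like.

import numpy as np

# Toy data: 64 samples, 100 input features, 10 outputs (assumed shapes)
N, D_in, H, D_out = 64, 100, 50, 10
x, y = np.random.randn(N, D_in), np.random.randn(N, D_out)

# Manually created weight matrices for one hidden layer
w1 = np.random.randn(D_in, H)
w2 = np.random.randn(H, D_out)

lr = 1e-6
for t in range(500):
    # Forward pass, written out operation by operation
    h = x.dot(w1)
    h_relu = np.maximum(h, 0)
    y_pred = h_relu.dot(w2)
    loss = np.square(y_pred - y).sum()

    # Backward pass: every gradient is derived and coded by hand
    grad_y_pred = 2.0 * (y_pred - y)
    grad_w2 = h_relu.T.dot(grad_y_pred)
    grad_h_relu = grad_y_pred.dot(w2.T)
    grad_h = grad_h_relu * (h > 0)
    grad_w1 = x.T.dot(grad_h)

    # Gradient descent update
    w1 -= lr * grad_w1
    w2 -= lr * grad_w2

Every extra layer means deriving and hand-coding its forward and backward computations.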
What if you want to change your network by adding an additional layer?
Approximately how much of the code would you have to rework?
Build a Neural Network using PyTorch
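The slide's code is not reproduced here; a sketch of the same two-layer network in PyTorch (same assumed sizes). The backward pass is handled by autograd, so adding a layer is just one more entry in the Sequential.

import torch

N, D_in, H, D_out = 64, 100, 50, 10
x, y = torch.randn(N, D_in), torch.randn(N, D_out)

# Adding a layer = adding one more entry to this Sequential
model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),
)
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)

for t in range(500):
    y_pred = model(x)            # forward pass
    loss = loss_fn(y_pred, y)    # compute loss
    optimizer.zero_grad()
    loss.backward()              # autograd computes all gradients
    optimizer.step()             # update parameters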
What if you want to change your network by adding an additional layer?
Approximately how much of the code would you have to rework?
Why Deep Learning Libraries?
• Easily build big computational graphs
• Easily compute gradients in computational graphs
• Run it all efficiently on GPU
Why PyTorch?
• PyTorch provides two high-level features
o Tensor computation with strong GPU acceleration
o Deep neural networks built on an autodiff system
• Easy to implement, code, and debug
• More flexible due to its dynamic computational graph.
• Important Packages:
o torch: a Tensor library like NumPy, with strong GPU support
o torch.autograd: an auto-differentiation library that supports all differentiable Tensor operations in torch
o torch.nn: a neural networks library deeply integrated with autograd, designed for maximum flexibility
o torch.optim: an optimization package with standard optimization methods such as SGD, RMSProp, LBFGS, Adam, etc.
o …
What is a computational graph?
A computational graph represents a mathematical function as a graph of many small mathematical operations that together produce the same output.
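For instance (this example function is an assumption, not from the slide), f(x, y, z) = (x + y) * z can be split into two nodes:

x, y, z = 2.0, -1.0, 3.0
a = x + y        # add node
f = a * z        # multiply node: same output as evaluating the whole expression at once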
Basics of PyTorch
Tensor
• Tensors are similar to NumPy's ndarrays
• The addition is that tensors can also be used on a GPU to accelerate computing
• They support all the data types in NumPy
• requires_grad: setting this flag tells autograd to record operations on the tensor so that gradients can be computed
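A small sketch of these points (the shapes and values here are illustrative):

import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Like an ndarray, but it can live on the GPU
a = torch.ones(3, 4, device=device)

# requires_grad=True asks autograd to track operations on this tensor
w = torch.zeros(3, 4, dtype=torch.float32, requires_grad=True)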
Basic operations
Initialize a tensor from a list
Initialize a 0-d tensor from a scalar
Convert a 0-d tensor back to a Python number
Mixed dtypes are promoted to the type with the larger word size
The dtype can be specified while initializing
Just like NumPy, assigning a tensor to another variable does not copy it; both names point to the same memory (use .clone() for a real copy)
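The slide's code is not reproduced here; a sketch of the operations listed above, with made-up values:

import torch

t = torch.tensor([1, 2, 3])            # initialize a tensor from a list
s = torch.tensor(3.5)                  # 0-d tensor from a scalar
s.item()                               # back to a plain Python number -> 3.5

# Mixing dtypes promotes to the wider type (int + float -> float)
(torch.tensor([1, 2]) + torch.tensor([0.5, 0.5])).dtype   # torch.float32

f = torch.tensor([1, 2], dtype=torch.float64)   # dtype given at creation

u = t            # plain assignment: u and t share the same memory
u[0] = 99        # ...so this also changes t
v = t.clone()    # an independent copy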
Basic operations
Generate a random tensor
Observe that all three operations produce the same output
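The three operations on the original slide are not reproduced here; presumably they are the three equivalent spellings of element-wise addition, sketched below:

import torch

x = torch.rand(2, 3)          # random tensor, values uniform in [0, 1)
y = torch.rand(2, 3)

# Three ways of writing the same operation; all three outputs are equal
z1 = x + y
z2 = torch.add(x, y)
z3 = x.add(y)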
Basic operations
Reshaping a tensor
NumPy–Torch bridge
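A sketch of both operations (shapes are illustrative):

import torch
import numpy as np

t = torch.arange(12)
r = t.view(3, 4)                    # reshape to 3x4 (reshape() also works)

# NumPy <-> Torch bridge: the two objects share the same memory
a = t.numpy()                       # tensor -> ndarray
b = torch.from_numpy(np.ones(5))    # ndarray -> tensor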
More Details
• https://pytorch.org/docs/stable/tensors.html
Define a Network
torch.nn
NN package
• Higher-level wrappers for working with neural nets
o Layers
o Activation Functions
o Loss functions
o Containers
• Gives flexibility to go beyond the prebuilt classes
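The slide's examples are not reproduced; a minimal sketch of one item from each category (layer, activation function, loss function, container), with made-up sizes:

import torch.nn as nn

layer = nn.Linear(100, 50)         # a layer
act = nn.ReLU()                    # an activation function
loss_fn = nn.CrossEntropyLoss()    # a loss function
container = nn.Sequential(layer, act, nn.Linear(50, 10))   # a container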
MLP: Block Diagram
[Block diagram: Input → Input Layer → (weight vector W1) → Hidden Layers → (weight vector Wn) → Output Layer → Loss, e.g. cross-entropy loss, computed against the Label]
Objective: find the best parameters (the weight vectors W1 … Wn) that minimize the loss.
As a Sequence of Layers
• Define our model as a sequence of layers (see the sketch below)
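The slide's code is not shown; a minimal sketch with assumed layer sizes:

import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),   # input layer -> hidden
    nn.ReLU(),
    nn.Linear(256, 10),    # hidden -> output
)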
As a subclass of torch.nn.Module
• Usually this approach is used when user-defined or more complicated forward propagations are needed
• Define a class with a forward() method (and generally a constructor)
• __init__(): defines / initializes the necessary layers, parameters, etc.
• forward(): defines the flow from input to output
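A sketch of the same idea as a subclass (the architecture here is an assumption):

import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        # define / initialize the necessary layers here
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        # define the flow from input to output here
        h = torch.relu(self.fc1(x))
        return self.fc2(h)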
Train a Network
Components Involved & Method
• Network model
• DataLoader
• Loss function
• Optimizer
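A sketch of how those four components fit together in a typical training loop (the dataset and hyperparameters below are placeholders, not the experiment's real ones):

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data standing in for the experiment's real dataset
X = torch.randn(1000, 20)
y = torch.randint(0, 2, (1000,))
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)   # DataLoader

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))   # network model
loss_fn = nn.CrossEntropyLoss()                                         # loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)                # optimizer

for epoch in range(5):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()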
Quick Glance into Ex4
• Expt-4
Auto Differentiation
Find the gradient of C w.r.t. x, y, z
We can use the chain rule by introducing intermediate variables and building a computational graph
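As a worked illustration (the slide's actual expression for C is not reproduced here), take C = (x + y) * z and introduce the intermediate variable a = x + y, so that C = a * z. The chain rule then gives ∂C/∂z = a, ∂C/∂a = z, ∂C/∂x = (∂C/∂a)(∂a/∂x) = z, and ∂C/∂y = (∂C/∂a)(∂a/∂y) = z.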
Computational Graph: NumPy
Issues:
• No GPU support
• Have to compute gradients on your own
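A sketch of the NumPy version for the same illustrative C = (x + y) * z: the graph is evaluated forward, and every gradient has to be derived and coded by hand.

import numpy as np

x, y, z = np.float64(2.0), np.float64(-1.0), np.float64(3.0)

# Forward pass through the graph
a = x + y
C = a * z

# Backward pass: gradients derived and written manually
dC_da = z
dC_dz = a
dC_dx = dC_da * 1.0     # da/dx = 1
dC_dy = dC_da * 1.0     # da/dy = 1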
Computational Graph: PyTorch
Computation happens on the GPU, if available
Initialize a tensor with requires_grad=True to build a computational graph
Forward pass
Backward pass to compute gradients
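The corresponding PyTorch sketch for the same illustrative function; marking the inputs with requires_grad=True builds the graph, and backward() fills in the gradients.

import torch

device = "cuda" if torch.cuda.is_available() else "cpu"   # computation on GPU, if available

# requires_grad=True makes these tensors nodes in the computational graph
x = torch.tensor(2.0, requires_grad=True, device=device)
y = torch.tensor(-1.0, requires_grad=True, device=device)
z = torch.tensor(3.0, requires_grad=True, device=device)

C = (x + y) * z      # forward pass builds the graph
C.backward()         # backward pass computes the gradients

x.grad, y.grad, z.grad    # dC/dx = z = 3, dC/dy = 3, dC/dz = x + y = 1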
[Side-by-side comparison: NumPy vs. PyTorch vs. PyTorch on GPU]
PyTorch: Three Levels of Abstraction
• Tensor: imperative ndarray, but runs on GPU
• Variable: node in a computational graph; stores data and gradient ("requires_grad = True")
• Module: a neural network layer; may store state or learnable weights
Thanks!!
Questions?
Deep Learning Libraries
• Popular Libraries
o PyTorch [Python] – Facebook
o Torch [Lua]
o Theano [Python] – University of Montreal
o Caffe – UC Berkeley
o TensorFlow – Google
o MatConvNet – University of Oxford
o Caffe 2
Timeline: 1989 LeNet · 1997 LSTM · 2002 Torch · 2009 Theano · 2012 Caffe · 2015 TensorFlow · 2016 PyTorch · 2017 Caffe 2