Neural Network with Python & TensorFlow
Neural Network
Count Chu
Topic
• Simple Classifier
– Learning Rate
• Neural Node
– Error Function
– Learning Rate
– Sigmoid Function
• Neural Network
– 3-Layer
– N-Layer
– Back-Propagating Errors
– Slope of Error Function
– Gradient Descent
• Implementation
– DIY with Python
– TensorFlow – Gradient Descent
• Resources
– TensorFlow
– http://countchu2.blogspot.com
– http://blog.csdn.net/countchu/article
Training a Simple Classifier
Apply Learning Rate in Classifier
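A minimal sketch of what this step could look like in Python (the sample data, variable names, and learning rate are assumptions, not taken from the slides): the classifier's slope is nudged toward each training example, and the learning rate moderates each correction.

# Sketch: simple classifier y = A * x, trained by nudging the slope A
# toward each target, with the learning rate moderating the correction.
def train_classifier(samples, A=0.25, learning_rate=0.5):
    """samples: list of (x, target) pairs; returns the adjusted slope A."""
    for x, target in samples:
        y = A * x                              # current prediction
        error = target - y                     # E = t - y
        delta_A = learning_rate * (error / x)  # moderated correction
        A += delta_A
    return A

# Hypothetical data: two labelled points the dividing line should separate.
print(train_classifier([(3.0, 1.0), (1.0, 3.0)]))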
Neural Node
Transform the Classifier Problem into a Neural Network.
Error Function
E = t - y
E = (t1 - y1) + (t2 - y2)
y = f (w * x)
y = w * x
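As a quick illustration, the error formulas above worked through in Python; all the numbers are made up.

# Worked example of the error formulas above (values are illustrative).
w, x, t = 0.5, 2.0, 1.5
y = w * x                        # linear node: y = w * x
E = t - y                        # single output: E = t - y
print(E)                         # 0.5

# Two outputs: E = (t1 - y1) + (t2 - y2)
t1, y1, t2, y2 = 1.0, 0.8, 0.0, 0.3
print((t1 - y1) + (t2 - y2))     # -0.1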
Apply Learning Rate in a Neural Node
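A hedged sketch of applying the learning rate to a single linear node y = w * x: the full correction to the weight would be E / x, and the learning rate L scales that step (the concrete numbers are assumptions).

# Sketch: repeatedly move the weight of one node part of the way toward
# the target, with the learning rate L controlling the step size.
def update_weight(w, x, t, L=0.3):
    y = w * x
    E = t - y
    delta_w = L * (E / x)   # full correction would be E / x; L moderates it
    return w + delta_w

w = 0.5
for _ in range(5):
    w = update_weight(w, x=2.0, t=1.5)
    print(round(w, 4))      # approaches t / x = 0.75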
Sigmoid Function
y = Sigmoid (Sum (w * x))
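A small NumPy sketch of the formula on this slide; the weight and input values are made up.

import numpy as np

# y = Sigmoid(Sum(w * x)) for a single node.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes any input into (0, 1)

w = np.array([0.9, 0.3, -0.5])        # weights into the node (made-up values)
x = np.array([1.0, 0.5, -1.5])        # inputs to the node
y = sigmoid(np.sum(w * x))            # weighted sum, then sigmoid
print(y)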
Output in 3-Layer Neural Network
Output in N-Layer Neural Network
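One way the 3-layer and N-layer forward passes could be written with NumPy matrix multiplication; the layer sizes and random weights are assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Each layer's output is sigmoid(W . previous_output), applied layer by layer.
np.random.seed(0)
W_input_hidden  = np.random.normal(0.0, 0.5, (3, 3))
W_hidden_output = np.random.normal(0.0, 0.5, (3, 3))

x = np.array([[1.0], [0.5], [-1.5]])          # column vector of inputs
hidden_out = sigmoid(W_input_hidden @ x)      # 3-layer: one hidden layer
final_out  = sigmoid(W_hidden_output @ hidden_out)
print(final_out)

# N-layer: the same step applied over a list of weight matrices.
def forward(weights, x):
    out = x
    for W in weights:
        out = sigmoid(W @ out)
    return out

print(forward([W_input_hidden, W_hidden_output], x))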
Back-Propagating Errors in Neural Network
e = t - y
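A sketch of back-propagating the output error e = t - y to the hidden layer by splitting it in proportion to the connecting weights, commonly written e_hidden = Wᵀ · e_output; the weight, target, and output values are made up.

import numpy as np

W_hidden_output = np.array([[0.6, 0.4],
                            [0.1, 0.9]])
t = np.array([[1.0], [0.0]])   # targets (made-up values)
y = np.array([[0.7], [0.3]])   # network outputs (made-up values)

e_output = t - y                        # e = t - y
e_hidden = W_hidden_output.T @ e_output # error shared back by weight
print(e_output)
print(e_hidden)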
Slope of Error Function in Neural Network
Gradient Descent in Neural Network
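A sketch of one gradient-descent step on the hidden-to-output weights, assuming the usual squared-error cost and sigmoid outputs (the slides may present the derivation differently); all numbers are illustrative.

import numpy as np

# Slope of the error:  dE/dW = -(t - y) * y * (1 - y) . hidden_out^T
# Gradient descent moves the weights a small step against that slope.
learning_rate = 0.1
hidden_out = np.array([[0.6], [0.4]])
W = np.array([[0.2, -0.3],
              [0.5,  0.1]])
t = np.array([[1.0], [0.0]])

y = 1.0 / (1.0 + np.exp(-(W @ hidden_out)))      # forward pass
e = t - y                                        # output error
gradient = -(e * y * (1.0 - y)) @ hidden_out.T   # slope of E w.r.t. W
W = W - learning_rate * gradient                 # step downhill
print(W)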
How to Implement a 3-Layer Neural Network
DIY with Python
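A compact do-it-yourself sketch that ties the previous slides together: forward pass, back-propagated errors, and gradient-descent weight updates in one 3-layer network class. The class layout, layer sizes, and learning rate are assumptions, not the slides' exact code.

import numpy as np

class ThreeLayerNetwork:
    """Minimal input-hidden-output network trained with back-propagation
    and gradient descent; a sketch, not the slides' exact implementation."""

    def __init__(self, n_input, n_hidden, n_output, learning_rate=0.3):
        self.lr = learning_rate
        # small random weights between the layers
        self.W_ih = np.random.normal(0.0, n_input ** -0.5, (n_hidden, n_input))
        self.W_ho = np.random.normal(0.0, n_hidden ** -0.5, (n_output, n_hidden))

    @staticmethod
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, x):
        x = np.array(x, ndmin=2).T
        h = self.sigmoid(self.W_ih @ x)      # hidden layer output
        y = self.sigmoid(self.W_ho @ h)      # final output
        return x, h, y

    def train(self, x, target):
        x, h, y = self.forward(x)
        t = np.array(target, ndmin=2).T
        e_output = t - y                     # e = t - y
        e_hidden = self.W_ho.T @ e_output    # back-propagate the error
        # gradient-descent updates (squared-error cost, sigmoid activations)
        self.W_ho += self.lr * (e_output * y * (1.0 - y)) @ h.T
        self.W_ih += self.lr * (e_hidden * h * (1.0 - h)) @ x.T

    def query(self, x):
        return self.forward(x)[2]

# Hypothetical usage: learn to map one small input pattern to a target.
net = ThreeLayerNetwork(3, 4, 2)
for _ in range(500):
    net.train([0.9, 0.1, 0.2], [0.95, 0.05])
print(net.query([0.9, 0.1, 0.2]))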
TensorFlow – Gradient Descent
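A sketch of gradient descent in TensorFlow, written against the TensorFlow 2.x eager API (the slides may target the older 1.x GradientDescentOptimizer); it fits y = w * x to a single made-up target by following the gradient of the squared error.

import tensorflow as tf

w = tf.Variable(0.5)
x, t = tf.constant(2.0), tf.constant(1.5)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for _ in range(20):
    with tf.GradientTape() as tape:
        y = w * x
        loss = tf.square(t - y)          # squared error to minimise
    grad = tape.gradient(loss, [w])      # slope of the error w.r.t. w
    optimizer.apply_gradients(zip(grad, [w]))

print(w.numpy())                         # approaches t / x = 0.75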
