AdvML Notes

DEEP BOLTZMANN MACHINES:

In a full Boltzmann machine each node is connected to every other node, so the number of connections grows quadratically with the number of nodes, which makes training intractable for large networks.

Deep Boltzmann Machines (DBMs) are similar to deep belief networks, except that the connections between the layers are also undirected (in a deep belief network only the top two layers are undirected). DBMs can extract features and hence can be used for more complex tasks.

Contrastive Divergence: the RBM adjusts its weights by this method. Starting from randomly assigned initial weights, the RBM computes the hidden nodes, which in turn use the same weights to reconstruct the input nodes.

Each hidden node is computed from all the visible nodes, and each visible node is reconstructed from all the hidden nodes. The reconstructed input therefore differs from the original input, even though the weights are the same. This process continues until the reconstructed input matches the previous input, at which point the process is said to have converged. This repeated back-and-forth sampling is known as "Gibbs sampling".
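The up-down sampling described above can be sketched as a single step of contrastive divergence (CD-1) in NumPy. The network size, learning rate, and random input here are toy values chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RBM: 6 visible units, 3 hidden units, small random initial weights.
n_visible, n_hidden = 6, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)  # visible biases
b_h = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, lr=0.1):
    """One contrastive-divergence step built on one Gibbs sampling pass."""
    # Up pass: each hidden node is computed from all visible nodes.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(n_hidden) < p_h0).astype(float)
    # Down pass: each visible node is reconstructed from all hidden
    # nodes, using the same weights (transposed).
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(n_visible) < p_v1).astype(float)
    # Second up pass on the reconstruction.
    p_h1 = sigmoid(v1 @ W + b_h)
    # Update: difference between data and reconstruction statistics.
    return W + lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))

v0 = rng.integers(0, 2, n_visible).astype(float)
W = cd1_step(v0)
```

In practice this step is repeated over many training examples until the reconstructions stop improving.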

Working and Validation of DBNs:

DBNs work in two main phases:

1. Pre-training

2. Fine-tuning
Pre-training: In the pre-training phase the network learns to represent the input data layer by layer. Each layer is trained independently as an RBM, which allows the network to learn complex data representations efficiently. During this phase the network learns the probability distribution of the inputs, which helps it understand the underlying structure of the data.

Fine-tuning: In this phase the DBN adjusts its parameters for a specific task such as classification or regression. This is typically done using backpropagation, where the network's performance on the task is evaluated and the errors are used to update the network's parameters. This phase often involves supervised learning, where the network is trained with labelled data.

Implementation of DBNs:

To implement DBNs we need to install numpy, pandas, and scikit-learn, then follow these steps:

1. Import libraries

2. Load the dataset

3. Preprocessing

4. RBM layer

5. Classifier layer

6. DBN pipeline

7. Training

8. Evaluation
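The steps above can be sketched with scikit-learn's `BernoulliRBM` as the RBM layer and logistic regression as the classifier layer. This uses a single RBM; a deeper DBN would stack several the same way. The digits dataset and all hyperparameter values are illustrative choices, not prescribed by the notes:

```python
# 1. Import libraries
from sklearn.datasets import load_digits              # 2. load the dataset
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM       # 4. RBM layer
from sklearn.linear_model import LogisticRegression   # 5. classifier layer
from sklearn.pipeline import Pipeline                 # 6. DBN pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0  # 3. preprocessing: scale pixel values into [0, 1]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

dbn = Pipeline([
    ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05,
                         n_iter=10, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])

dbn.fit(X_train, y_train)             # 7. training
accuracy = dbn.score(X_test, y_test)  # 8. evaluation
print(f"Test accuracy: {accuracy:.2f}")
```

The pipeline performs the unsupervised RBM pre-training first, then the supervised classifier corresponds to the fine-tuning phase described earlier.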
Composition of Neural Networks:
A neural network is a fundamental component of deep learning, a subfield of AI. It is a computational model inspired by the structure and functioning of the human brain.

Neurons: Neurons are the basic building blocks of a neural network. They receive inputs, perform computations, and produce outputs. Each neuron is connected to other neurons through weighted connections. These weights determine the strength of each connection and play an important role in the learning process.

Activation Functions: An activation function introduces non-linearity into the neural network. It takes the weighted sum of inputs from the previous layer and produces an output. Common activation functions include the sigmoid function, the tanh function, and the rectified linear unit (ReLU).
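The three activation functions named above can be sketched applied to the same weighted sum. The inputs, weights, and bias here are made-up values for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))  # squashes any value into (0, 1)

def tanh(z):
    return np.tanh(z)                # squashes any value into (-1, 1)

def relu(z):
    return np.maximum(0.0, z)        # zero for negatives, identity otherwise

inputs = np.array([0.5, -1.0, 2.0])
weights = np.array([0.4, 0.3, -0.2])
bias = 0.1
z = inputs @ weights + bias  # weighted sum of inputs from the previous layer

print(sigmoid(z), tanh(z), relu(z))
```

For a negative weighted sum like this one, ReLU outputs exactly zero while sigmoid and tanh output small non-zero values, which is the practical difference between them.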

Layers: A neural network is organized into layers, which are composed of multiple neurons. The input layer receives the input data, the output layer produces the final output, and the hidden layers lie between them. Hidden layers enable the network to learn complex patterns and representations.

Loss Functions: The loss function measures the discrepancy between the predicted output of the neural network and the true output. It quantifies the error and provides a signal for the network to update its weights and biases.

Weights and Biases: Weights and biases are parameters that determine the behavior of a neural network. Each connection between neurons has an associated weight, which controls the strength of the connection. Biases are additional parameters added to each neuron that allow it to shift the activation function.
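A single neuron combines the pieces above: weights scale each input, the bias shifts the activation, and a loss function scores the result. All values here are made up for illustration, and squared error stands in for the loss function:

```python
import numpy as np

inputs = np.array([1.0, 0.0, 1.0])
weights = np.array([0.6, -0.4, 0.2])  # connection strengths
bias = -0.5                           # shifts the activation function

z = inputs @ weights + bias           # weighted sum plus bias
output = 1.0 / (1.0 + np.exp(-z))     # sigmoid activation

true_output = 1.0
loss = (output - true_output) ** 2    # squared error: the signal used
                                      # to update weights and biases
print(output, loss)
```

Training repeats this forward pass, computes the loss, and nudges the weights and bias in the direction that reduces it.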

A Primer on Neural Networks: A primer on neural networks covers:

1. Concept of a Neuron

2. Network Layers

3. Different Neural Network Architectures
