Deep Learning Notes
What are Deep Boltzmann Machines (DBMs)?

Deep Boltzmann Machines (DBMs) are a kind of artificial neural network that belongs to the family of
generative models.

They are designed to discover intricate structures within large datasets by learning to recreate the input
data they’re given.

Think of a DBM as an artist who, after studying a collection of paintings, learns to create new
artworks that could belong to the same collection.

DBMs consist of multiple layers of hidden units, which are like the neurons in our brains. These
units work together to capture the probabilities of various patterns within the data.

Unlike some other neural networks, units in a DBM are connected between adjacent layers but not
within the same layer, which allows them to create a web of relationships between different features
in the data. This structure helps DBMs understand complex data like images, text, or sound.

The ‘deep’ in Deep Boltzmann Machine refers to the multiple layers in the network, which allow it to
build a deep understanding of the data.

Each layer captures increasingly abstract representations of the data.

1. The first layer might detect edges in an image,
2. the second layer might detect shapes, and
3. the third layer might detect whole objects like cars or trees.

Mathematical concepts
The probability of a certain state (a combination of visible units v and hidden units h) is given by
the Boltzmann distribution:

P(v, h) = e^{-E(v, h)} / Z

where Z is the partition function, a normalization factor that ensures all probabilities sum to
one. It is calculated as the sum of e^{-E(v, h)} over all possible states.
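The distribution above can be made concrete with a tiny toy model. The sketch below uses a hypothetical RBM-style energy function E(v, h) = -v·b - h·c - v·W·h with randomly chosen parameters W, b, c (all assumptions, not from the notes); the model is kept small enough that Z can be computed by brute-force enumeration, which is exactly why Z is intractable for realistic DBMs.

```python
import numpy as np

# Hypothetical parameters for a toy model: 2 visible and 2 hidden binary units.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))   # visible-hidden weights
b = rng.normal(size=2)        # visible biases
c = rng.normal(size=2)        # hidden biases

def energy(v, h):
    # RBM-style energy: E(v, h) = -v.b - h.c - v.W.h
    return -(v @ b) - (h @ c) - (v @ W @ h)

# Enumerate every binary state to compute the partition function Z exactly.
# This is feasible only for tiny models; Z grows exponentially with unit count.
states = [np.array([i, j], dtype=float) for i in (0, 1) for j in (0, 1)]
Z = sum(np.exp(-energy(v, h)) for v in states for h in states)

def probability(v, h):
    # Boltzmann distribution: P(v, h) = e^{-E(v, h)} / Z
    return np.exp(-energy(v, h)) / Z

# Z normalizes the distribution, so probabilities over all states sum to one.
total = sum(probability(v, h) for v in states for h in states)
print(round(total, 6))  # 1.0
```

Note how lower-energy states receive higher probability, which is the sense in which learning means "finding states that minimize energy".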
Several key concepts underpin Deep Boltzmann Machines:

• Energy-Based Models: DBMs are energy-based models, which means they assign an ‘energy’
  level to each possible state of the network. States that are more likely have lower energy. The
  network learns by finding states that minimize this energy.
• Stochastic Neurons: Neurons in a DBM are stochastic. Unlike in other types of neural networks,
  where neurons output a deterministic value based on their input, DBM neurons make random
  decisions about whether to activate.
• Unsupervised Learning: DBMs learn without labels. They look at the data and try to understand
  the underlying structure without any guidance on what features are important.
• Pre-training: DBMs often go through a pre-training phase where they learn one layer at a time.
  This step-by-step learning helps stabilize the learning process before the entire network is
  fine-tuned together.
• Fine-Tuning: After pre-training, DBMs are fine-tuned, which means they adjust all their
  parameters at once to better model the data.
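The stochastic-neuron idea from the list above can be sketched in a few lines. In Boltzmann machines the standard activation rule is that each hidden unit turns on with probability sigmoid(c_j + Σ_i v_i W_ij); the weights W and biases c below are hypothetical placeholders, not values from the notes.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical parameters: 4 visible units, 3 hidden units.
rng = np.random.default_rng(42)
W = rng.normal(scale=0.1, size=(4, 3))  # visible-hidden weights
c = np.zeros(3)                         # hidden biases

def sample_hidden(v):
    # Stochastic activation: each hidden unit fires with probability
    # sigmoid(c_j + sum_i v_i * W_ij), instead of outputting a
    # deterministic value as in a feed-forward network.
    p = sigmoid(c + v @ W)
    h = (rng.random(p.shape) < p).astype(float)
    return h, p

v = np.array([1.0, 0.0, 1.0, 1.0])  # an example visible configuration
h, p = sample_hidden(v)             # h is a random binary vector
```

Repeatedly alternating this sampling step between visible and hidden layers is the basis of the Gibbs-sampling procedures used when training such models.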
