In this post, we'll get a basic overview of deep learning and then understand two specific types of neural networks:
Recurrent Neural Networks (RNNs)
Long Short-Term Memory (LSTM) units
These will be useful when generating text from a source corpus. Later in this series, we will read source material such as a novel and then use our neural networks to generate brand-new text in the same style as the original source corpus. We will also learn how to create Q/A chatbots with Python and RNNs.
Introduction to the Perceptron (Artificial Neuron).
Before we understand neural networks, we need to understand their individual components first, such as a single neuron. Artificial neural networks (ANNs) actually have a basis in biology.
In this post, let's discuss three major things:
Biological Neuron
Perceptron Model
Mathematical Representation
Let's see how we can attempt to mimic a biological neuron with an artificial neuron, known as a perceptron. First, let's see how a biological neuron, such as a brain cell, works.
Biological Neuron:
Electrical signals get passed through the dendrites to the body of the cell, and later a single electrical signal is passed on through the axon, which connects to some other neuron.
Mimicking this, the perceptron is a simple model with 2 inputs and 1 output. The inputs take feature values, which can be anything from:
how many rooms a house has
how dark an image is
Now let's assign some values to these inputs:
  12
Input 0
        \
         \
          \
           [ Perceptron ] ------------- Output
          /
         /
        /
Input 1
  4
The next step is to have these inputs multiplied by some sort of weight. So we have weight 0 for input 0, and weight 1 for input 1. Typically, these weights are initialised through some sort of random generation.
  12
Input 0
        \
         \  0.5 (weight 0)
          \
           [ Perceptron ] ------------- Output
          /
         /  -1 (weight 1)
        /
Input 1
  4
Now these inputs are multiplied by the weights:
  12
Input 0
        \
         \  0.5 (weight 0) = 12 * 0.5 = 6
          \
           [ Activation Function ] ------------- Output
          /
         /  -1 (weight 1) = 4 * -1 = -4
        /
Input 1
  4
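The weighted sum in the diagram above can be checked with a few lines of Python (the inputs 12 and 4 and the weights 0.5 and -1 are just the example values used here):

```python
# Example inputs and weights from the diagram above
inputs = [12, 4]
weights = [0.5, -1]

# Weighted sum: 12 * 0.5 + 4 * (-1) = 6 - 4 = 2
weighted_sum = sum(x * w for x, w in zip(inputs, weights))
print(weighted_sum)  # 2.0
```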
Now we pass the sum of these products into an activation function. A simple one returns 1 if the sum of the weighted inputs is greater than 0, and 0 otherwise:

def activation(weighted_sum):
    return 1 if weighted_sum > 0 else 0
What if the original input was 0? If an input happens to be 0, then its weighted contribution is always 0, no matter what the weight is. To deal with this, we add a bias term, which here is 1.
  12
Input 0
        \
         \  0.5 (weight 0) = 12 * 0.5 = 6
          \
+1 Bias -- [ Activation Function ] ------------- Output
          /
         /  -1 (weight 1) = 4 * -1 = -4
        /
Input 1
  4
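Putting all the pieces together, here is a minimal sketch of the full perceptron for this example (a step activation function, the weights from the diagram, and a bias of 1):

```python
def activation(weighted_sum):
    # Step function: fire (1) if the sum is positive, else stay silent (0)
    return 1 if weighted_sum > 0 else 0

def perceptron(inputs, weights, bias):
    # Multiply each input by its weight, sum everything, then add the bias
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return activation(weighted_sum)

# 12 * 0.5 + 4 * (-1) + 1 = 6 - 4 + 1 = 3, and 3 > 0, so the output is 1
print(perceptron([12, 4], [0.5, -1], bias=1))  # 1
```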
So what does this look like mathematically? How can we represent our perceptron model?
y = Σ (from i = 0 to n) W_i * X_i + b
So once we have many perceptrons in a network, we'll see how we can extend this to matrix form.
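As a small preview of that matrix form, the weighted sum can be written as a dot product with NumPy; the 3-perceptron layer below is a hypothetical example, with weights chosen just for illustration:

```python
import numpy as np

x = np.array([12, 4])      # inputs
w = np.array([0.5, -1])    # weights
b = 1                      # bias

# Single perceptron: w . x + b = 6 - 4 + 1 = 3
z = np.dot(w, x) + b
output = np.where(z > 0, 1, 0)   # step activation
print(output)  # 1

# A whole layer of 3 perceptrons is just a weight matrix:
# each row holds the weights of one perceptron
W_layer = np.array([[0.5, -1.0],
                    [1.0,  0.0],
                    [-0.5, 0.25]])
b_layer = np.array([1.0, 0.0, -1.0])
z_layer = W_layer @ x + b_layer  # one weighted sum per perceptron
print(np.where(z_layer > 0, 1, 0))
```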