What is a Neural Network? | Introduction to Neural Networks | Python | TensorFlow | 2020

Topics covered in this Neural Network introduction:
  • Neurons (Perceptrons)
  • Activation Functions
  • Cost Functions
  • Gradient Descent
  • Backpropagation
Let's Get Started -

What is a Perceptron?

  • A neuron is also known as a perceptron.
  • Artificial Neural Networks are modeled on natural biological systems.
  • An Artificial Neural Network (ANN) is a software-based approach to replicating these biological neurons.
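To make the idea concrete, here is a minimal sketch of a single artificial neuron in plain Python (an illustrative toy, not TensorFlow's API): it sums its weighted inputs plus a bias and applies a step activation to decide whether to "fire".

```python
def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs (the "cell body" computation)
    total = sum(x * w for x, w in zip(inputs, weights))
    # Step activation: fire (1) if the total exceeds 0, else stay silent (0)
    return 1 if total + bias > 0 else 0

# Example: a two-input neuron wired up to behave like an AND gate
weights = [1.0, 1.0]
bias = -1.5
print(perceptron([1, 1], weights, bias))  # 1 (fires only when both inputs are on)
print(perceptron([1, 0], weights, bias))  # 0
```

The weights and bias here are hand-picked; the rest of this post builds toward learning them automatically.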

The Biological Neuron

First are the dendrites: think of these as the terminals at which the neuron receives its inputs (real dendrites can do computations and have feedback control, but this is not typically modeled in an artificial neuron).

Next is the cell body; think of this as where any processing occurs. Finally, the axon carries the output from the cell body to neighboring neurons and their dendrites. Basically, the neuron is a "simple" computational device: it receives input at its dendrites, does a computation at the cell body, and carries the output on its axon. (I'll just note here that a real neuron is much more complicated; for example, the dendrites themselves can carry out some kinds of computation.)

Biological neurons communicate across a synapse, which is a chemical “connection”, typically between an axon and dendrite. Signals flow from the axon terminal to receptors on the dendrite, mediated by the chemical state of both the axon and dendrite.

So what does learning mean in this setting? According to Hebbian theory (named after Canadian psychologist Donald Hebb), learning is related to the increase in synaptic efficacy that arises from the presynaptic cell's repeated and persistent stimulation of the postsynaptic cell.

It is this increase in "communication" efficacy that we call learning. One way to think about this is that the connection between the axon terminal and the dendrite is weighted, and the larger the weight (the concentration of neurotransmitters in the axon terminal, along with other chemical components, including receptors in the dendrites), the more likely the neuron is to "fire".

So in this setting learning consists of optimizing neuron firings in certain patterns; to put it a different way, learning consists of optimizing the connection weights between axons and dendrites in a way that leads to some observed behavior. We will return to this idea of optimizing weights as learning when we talk about training Artificial Neural Networks.
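The idea of learning as optimizing connection weights can be sketched with the classic perceptron learning rule, which nudges each weight in proportion to the prediction error. (This is an illustrative toy; real ANNs learn via gradient descent and backpropagation, covered later.)

```python
def predict(x, w, b):
    # Weighted sum plus bias, passed through a step activation
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) + b > 0 else 0

def train(samples, labels, lr=0.1, epochs=20):
    # Start with all "synaptic" weights at zero
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = y - predict(x, w, b)  # +1, 0, or -1
            # Strengthen or weaken each connection in proportion to its input
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Learn the OR function from examples
w, b = train([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 1, 1, 1])
print([predict(x, w, b) for x in [[0, 0], [0, 1], [1, 0], [1, 1]]])  # [0, 1, 1, 1]
```

Notice the Hebbian flavor: a weight only changes when its input is active, so connections that repeatedly contribute to correct firing are the ones that get strengthened.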

The Artificial Neuron:
