Wednesday 3 January 2018

What Is an Artificial Neural Network and How Does It Work?

Introduction:

To understand today's topic, we first need to understand what a neural network is. The term 'neural' comes from the basic unit of the nervous system, the neuron, and a network of such units is therefore called a neural network. In the human brain, that is the network of biological neurons; when the same capability is built into an artificial system that simulates this behaviour, we get Artificial Neural Networks.
Artificial Neural Networks, ANNs for short, have become well known, are a hot topic of interest, and find application in chat-bots, where they are often used for text classification. Unless you happen to be a neuroscientist, though, the brain analogy will not illustrate very much. Software analogies to the synapses and neurons of an animal brain have been on the rise, even though neural networks have already been used in the software industry for decades.

What does 'artificial neural network' mean?

Artificial Neural Networks can best be described as biologically inspired simulations performed on a computer to carry out specific tasks such as clustering, classification and pattern recognition. In general, an Artificial Neural Network is a biologically inspired network of (artificial) neurons configured to perform a specific set of tasks.

How do artificial neural networks work?

Artificial Neural Networks are best viewed as weighted directed graphs, in which the artificial neurons form the nodes and the connections between neuron outputs and neuron inputs are the directed, weighted edges. The network receives its input signal from the external world as a pattern or image in the form of a vector. These inputs are mathematically denoted x(n) for each of the n inputs.
Each input is then multiplied by its corresponding weight (these weights are the details the artificial neural network uses to solve a given problem). In general terms, the weights represent the strength of the interconnections between neurons inside the network. All the weighted inputs are summed inside the computing unit (yet another artificial neuron).
If the weighted sum comes out to zero, a bias is added to make the output non-zero, or otherwise to scale the system's response. The bias has its own weight, and its input is always equal to 1. The sum of weighted inputs can range from 0 to positive infinity, so to keep the response within a desired range, a threshold value is set as a benchmark, and the sum of weighted inputs is then passed through an activation function.
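As a minimal sketch of this computation (for a single neuron, with purely illustrative input values, weights and bias that are not taken from the text above), the weighted sum described here can be written in a few lines of Python:

```python
import numpy as np

# Hypothetical input vector x(n) and per-input weights (illustrative values only)
x = np.array([0.5, -1.2, 3.0])   # inputs received from the external world
w = np.array([0.4, 0.1, -0.6])   # strength of each interconnection

# The bias input is always 1; it is multiplied by the bias's own weight
bias = 1.0 * 0.2

# Weighted sum computed inside the "computing unit" (the artificial neuron)
net_input = np.dot(w, x) + bias
print(net_input)  # this value is then passed through an activation function
```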
The activation function is the set of transfer functions used to obtain the desired output. Activation functions come in various flavours, broadly divided into linear and non-linear functions. Some of the most commonly used activation functions are the binary, sigmoidal and hyperbolic tangent (tanh) functions, the latter two being non-linear. Now let us take a look at each of them in some detail:

Binary:

The output of the binary activation function is either a 0 or a 1. To achieve this, a threshold value is set up: if the net weighted input of the neuron is greater than the threshold, the activation function returns 1; otherwise it returns 0.
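A small sketch of this step behaviour (with an assumed threshold of 0, chosen purely for illustration) might look like this in Python:

```python
def binary_activation(net_input, threshold=0.0):
    """Binary (step) activation: returns 1 if the net weighted input
    exceeds the threshold, otherwise 0."""
    return 1 if net_input > threshold else 0

print(binary_activation(0.35))   # -> 1
print(binary_activation(-0.8))   # -> 0
```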

Sigmoidal Hyperbolic:

The sigmoidal hyperbolic function is, in general terms, an 'S'-shaped curve. Here a hyperbolic-tangent-like function is used to approximate the output from the actual net input. The function is defined as:
f(x) = 1 / (1 + exp(−λx))
where λ is the steepness parameter.
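As a rough illustration of the formula above (with λ exposed as a `steepness` argument and an assumed default value of 1.0), the function can be sketched in Python as:

```python
import math

def sigmoidal_activation(net_input, steepness=1.0):
    """Sigmoidal activation f(x) = 1 / (1 + exp(-lambda * x)),
    where `steepness` plays the role of the lambda parameter."""
    return 1.0 / (1.0 + math.exp(-steepness * net_input))

print(sigmoidal_activation(0.0))   # -> 0.5, the midpoint of the 'S' curve
print(sigmoidal_activation(2.0))   # approaches 1 for large positive inputs
```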
If you want to learn more, visit Mindmajix.

