
Showing posts from October, 2018

Activation Functions in Neural Networks

What are Activation Functions?

The activation function of a neuron defines the output of that neuron given a set of inputs. Activation functions are loosely inspired by activity in our brain, where different neurons are activated by different stimuli. For example, a cake will activate one set of neurons (something pleasant), whereas a garbage can will activate a different set (something unpleasant).

Activation functions are essential for an Artificial Neural Network to learn complicated, non-linear mappings between the inputs and the response variable. They introduce non-linear properties into our network. Their main purpose is to convert the input signal of a node in an A-NN into an output signal, which is then used as an input to the next layer in the stack.

Is it necessary to have an activation function? The answer is YES! If we do not apply an activation function, the output of every layer is just a linear function of its inputs, and the whole network collapses into a single linear model.
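As a minimal sketch of the idea, here are three commonly used activation functions implemented with NumPy (the function names and test inputs are illustrative, not from the original post):

```python
import numpy as np

def sigmoid(x):
    """Squashes any real input into (0, 1); historically popular, but can saturate."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Squashes input into (-1, 1); a zero-centred relative of the sigmoid."""
    return np.tanh(x)

def relu(x):
    """Rectified Linear Unit: passes positive values through, zeros out negatives."""
    return np.maximum(0.0, x)

# Each function maps a node's raw input signal to a non-linear output signal.
x = np.array([-2.0, 0.0, 2.0])
print(relu(x))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
```

Stacking layers that each apply one of these non-linearities is what lets the network represent mappings that no single linear function could.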

Recurrent Neural Networks and LSTM explained

What are Recurrent Neural Networks?

Recurrent Neural Networks are the state-of-the-art algorithm for sequential data. This is because they are the first algorithm that remembers its input, thanks to an internal memory, which makes them perfectly suited for machine learning problems that involve sequential data. In simple terms, they are networks with loops in them, allowing information to be saved. Here, A is the network, Xt is the input, and ht is the output. Although these loops make the RNN seem hard to interpret, in reality it is very simple: when we unroll the loop, an RNN can be thought of as multiple copies of the same network, each passing a message to the next. Because of their internal memory, RNNs are able to remember important things about the input they have received, which enables them to be very precise in predicting what comes next. This is why they are the preferred algorithm for sequential data like time series.
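The "unrolled" view above can be sketched as a plain loop: the same weights are applied at every time step, and the hidden state h plays the role of the internal memory (the layer sizes and weight names below are illustrative assumptions, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 3-dimensional inputs, 4-dimensional hidden state.
input_size, hidden_size = 3, 4
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the loop)
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One unrolled step: the new state mixes the current input with the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Unrolling over a sequence: each step is a "copy" of the same network,
# passing its message (the hidden state) to the next copy.
sequence = rng.normal(size=(5, input_size))  # 5 time steps
h = np.zeros(hidden_size)                    # internal memory starts empty
for x_t in sequence:
    h = rnn_step(x_t, h)
print(h.shape)  # (4,)
```

After the loop, h summarises everything the network has seen so far, which is exactly the memory that lets an RNN predict what comes next in a sequence.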