What is Dropout in Neural Networks?
The term “dropout” refers to dropping out units (both hidden and visible) in a neural network.
Simply put, dropout refers to ignoring a randomly chosen set of units (i.e. neurons) during the training phase.
By “ignoring”, we mean these units are not considered during a particular forward or backward pass. More technically, at each training stage, individual nodes are either dropped out of the net with probability 1 - p or kept with probability p, so that a reduced network is left; incoming and outgoing edges to a dropped-out node are also removed.
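To make this concrete, here is a minimal NumPy sketch of dropout under the definition above; the function name and the 1/keep_prob scaling (so-called “inverted” dropout, which keeps the expected activation the same at test time) are illustrative assumptions, not part of any particular library:

import numpy as np

def dropout(x, keep_prob, training=True):
    # At test time nothing is dropped; the full network is used.
    if not training:
        return x
    # Keep each unit with probability keep_prob, drop it otherwise.
    mask = np.random.rand(*x.shape) < keep_prob
    # Scale surviving activations by 1/keep_prob so the expected
    # activation matches what the full network produces at test time.
    return x * mask / keep_prob

For example, dropout(np.ones((2, 4)), keep_prob=0.5) zeroes roughly half of the entries and doubles the surviving ones.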
Why do we need Dropout?
The answer to this question is “to prevent over-fitting”.
A fully connected layer occupies most of the parameters in a network, and hence its neurons develop co-dependency on one another during training. This curbs the individual power of each neuron and leads to over-fitting of the training data.
Dropout in TensorFlow:
tf.nn.dropout
dropout(x, keep_prob, noise_shape=None, seed=None, name=None)
x: A floating point tensor.
keep_prob: A scalar Tensor with the same type as x. The probability that each element is kept.
noise_shape: A 1-D Tensor of type int32, representing the shape for randomly generated keep/drop flags.
seed: A Python integer. Used to create random seeds. See tf.set_random_seed for behavior.
name: A name for this operation (optional).
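Putting the arguments together, here is a small usage sketch; it assumes the TensorFlow 1.x API whose signature is listed above (in TensorFlow 2.x, keep_prob was replaced by a rate argument, where rate = 1 - keep_prob):

import tensorflow as tf

# A toy batch of activations: 2 examples, 4 units each.
h = tf.ones([2, 4])

# During training, keep each element with probability 0.5;
# kept elements are scaled up by 1/keep_prob.
h_drop = tf.nn.dropout(h, keep_prob=0.5)

with tf.Session() as sess:
    print(sess.run(h_drop))  # roughly half the entries are 0.0, the rest 2.0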