Dropout is a regularization technique that is particularly valuable for large, complex deep neural networks. For a much more detailed exploration of dropout in deep neural networks, see Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. (2014). The concept behind dropout is straightforward: during training, units (for example, input and hidden neurons) are probabilistically dropped, along with all connections to and from them.
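The mechanics can be sketched in a few lines of NumPy. This is a minimal illustration of *inverted* dropout, the common formulation in which surviving activations are rescaled during training so that no adjustment is needed at inference time; the function name and shapes are illustrative, not from the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p_drop, training=True):
    """Inverted dropout: zero each unit with probability p_drop during
    training, scaling the survivors by 1/(1 - p_drop) so that the
    expected activation matches the no-dropout case."""
    if not training or p_drop == 0.0:
        return x  # at inference time, dropout is a no-op
    # Independent Bernoulli mask: 1 with probability (1 - p_drop)
    keep_mask = (rng.random(x.shape) >= p_drop).astype(x.dtype)
    return x * keep_mask / (1.0 - p_drop)

h = np.ones(10_000)           # stand-in for a layer's activations
out = dropout(h, p_drop=0.5)  # roughly half the units are zeroed
```

Because each mask is drawn independently at every training step, the network effectively trains a different "thinned" sub-network each time, which is the source of dropout's regularizing effect.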
For example, the following diagram illustrates what might happen at each step of training for a model where hidden neurons and their connections are dropped with a probability ...