Dropout

A regularization technique that randomly sets a fraction of neural network activations to zero during training, preventing co-adaptation of neurons. Dropout reduces overfitting and improves generalization. At inference time all neurons are active: in the original formulation their outputs are scaled by the keep probability, while the now-common "inverted dropout" variant instead scales activations during training so inference needs no adjustment. Dropout is commonly applied in policy networks to prevent overfitting to limited robot demonstration data.
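A minimal sketch of the inverted-dropout variant in NumPy (the function name and shapes are illustrative, not from a specific library):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each activation with
    probability p and scale survivors by 1/(1-p); at inference, pass
    the input through unchanged."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

x = np.ones((4, 8))

# Training: roughly a fraction p of activations are zeroed,
# and the survivors are scaled up (here by 2x, since p=0.5).
y = dropout(x, p=0.5, training=True, rng=np.random.default_rng(0))

# Inference: identity, no rescaling needed.
z = dropout(x, p=0.5, training=False)
```

The 1/(1-p) scaling keeps the expected value of each activation the same in training and inference, which is why the inference path can be a no-op.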
