Cross-Entropy Loss

A loss function that measures the dissimilarity between a predicted probability distribution and the true label distribution. For classification with one-hot labels, it reduces to the negative log-likelihood of the correct class: L = -log p_y, where p_y is the probability the model assigns to the true class y. In robot learning, cross-entropy is used for discrete action prediction, token generation in VLA models, and training language-conditioned policies over discretized action spaces.
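As a minimal sketch of the computation (NumPy-based; the function and variable names are illustrative, not from any particular library), the per-example loss is the negative log-softmax of the logit for the correct class:

```python
import numpy as np

def cross_entropy(logits, target):
    """Cross-entropy for one example: -log softmax(logits)[target].

    logits: unnormalized class scores, shape (num_classes,)
    target: integer index of the correct class
    """
    # Shift by the max logit before exponentiating for numerical stability.
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    return -log_probs[target]

# Hypothetical example: a discretized action space with 4 bins, bin 2 correct.
logits = np.array([0.5, 1.2, 3.0, -0.3])
print(cross_entropy(logits, target=2))  # ~0.25: bin 2 already has the highest score
```

Framework implementations such as PyTorch's nn.CrossEntropyLoss fuse the log-softmax and negative log-likelihood steps in the same way, so models output raw logits rather than probabilities.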

ML · Training
