Positional Encoding

A mechanism that injects information about token position into transformer inputs, since self-attention is permutation-invariant and cannot distinguish token order on its own. Sinusoidal and learned absolute encodings are common; rotary position embedding (RoPE) has become the de facto standard in modern large language models. In robot learning, positional encodings also encode temporal position within action sequences.
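As a minimal sketch, the classic sinusoidal scheme assigns each position a vector of sines and cosines at geometrically spaced frequencies, which the model adds to its token embeddings (function name and dimensions here are illustrative):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position vectors."""
    positions = np.arange(seq_len)[:, None]        # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]       # (1, d_model / 2)
    # Wavelengths form a geometric progression from 2*pi to 10000*2*pi.
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even dimensions
    pe[:, 1::2] = np.cos(angles)                   # odd dimensions
    return pe

# Each row encodes one position; it is added elementwise to the
# token embedding at that position before the first attention layer.
pe = sinusoidal_positional_encoding(seq_len=8, d_model=16)
```

Because each frequency pair behaves like a rotation, relative offsets between positions are expressible as linear functions of the encodings, which is the property RoPE makes explicit by rotating query and key vectors directly.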
