self-attention

Self-attention refers to a mechanism in natural language processing and machine learning in which each element of an input sequence is compared against every other element of the same sequence. The resulting weights measure how relevant each position is to every other, and each output is a weighted combination of the inputs, enabling the model to focus on the most relevant parts of its own input during processing.
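
The sketch below illustrates the common scaled dot-product form of self-attention in NumPy. The projection matrices Wq, Wk, and Wv, the sequence length, and the embedding sizes are illustrative assumptions, not part of any particular model:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of embeddings.

    X          : (seq_len, d_model) input embeddings
    Wq, Wk, Wv : (d_model, d_k) learned projection matrices (assumed given)
    """
    Q = X @ Wq  # queries: what each position is looking for
    K = X @ Wk  # keys: what each position offers for matching
    V = X @ Wv  # values: the content that gets mixed together

    # Pairwise relevance scores between all positions, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(K.shape[-1])

    # Row-wise softmax turns scores into attention weights that sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    # Each output vector is a weighted mix of all value vectors
    return weights @ V

# Toy example: 4 tokens, 8-dimensional embeddings, random projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-aware vector per input position
```

Because the queries, keys, and values are all derived from the same sequence X, the mechanism is "self"-attention; attending to a different sequence instead would be cross-attention.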
