self-attention
Self-attention is a mechanism in machine learning and natural language processing in which each element of an input sequence attends to, and weighs, every other element of the same sequence. By computing these pairwise weights, the model determines the relative importance and relationships of the elements and can focus on the most relevant information during processing.
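As a concrete illustration, the sketch below implements single-head scaled dot-product self-attention with NumPy. It is a minimal example, not the implementation of any particular library: the projection matrices, dimensions, and variable names are illustrative assumptions.

```python
# Minimal sketch of single-head scaled dot-product self-attention.
# All sizes and weights here are illustrative assumptions.
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d_model) input embeddings.
    W_q, W_k, W_v: (d_model, d_k) projection matrices.
    Returns the (seq_len, d_k) context vectors and the
    (seq_len, seq_len) attention weights, where row i gives the
    weight position i assigns to every position in the sequence.
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise similarity of positions
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Example: 4 tokens with 8-dimensional embeddings, projected to d_k = 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
context, weights = self_attention(X, W_q, W_k, W_v)
print(weights.round(2))  # how strongly each token attends to the others
```

Each row of the printed weight matrix shows how one position distributes its attention over the whole sequence, which is the sense in which the sequence "weighs its own elements."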
Related Concepts (17)
- attention layers
- attention mechanism
- attention-based models
- deep learning
- image recognition
- information retrieval
- language understanding
- machine learning
- music generation
- natural language processing
- neural networks
- recommender systems
- reinforcement learning
- sequence modeling
- transformer architecture
- transformer models
- video analysis
Similar Concepts
- attention and consciousness
- attention regulation
- attentional control
- attentional focus
- attentiveness
- auditory attention
- embodied attention
- focus and attention
- perception and attention
- selective attention
- self-awareness
- self-observation
- self-recognition
- self-reflection
- self-reflection and self-awareness