word2vec

Word2Vec is a technique in natural language processing for learning word embeddings by training a shallow neural network either to predict a word's surrounding context from the word itself (skip-gram) or to predict the word from its context (CBOW). Because the model is trained on word co-occurrences in a large corpus, it captures semantic relationships between words, representing each word as a dense vector in a continuous vector space where similar words end up close together.
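The skip-gram variant can be sketched in a few dozen lines. The following is a minimal illustrative implementation using a full softmax over the vocabulary (real word2vec uses negative sampling or hierarchical softmax for efficiency); the corpus, hyperparameters, and helper names are all invented for the example.

```python
import numpy as np

# Toy corpus; real word2vec trains on millions of sentences.
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word_to_idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2  # vocab size, embedding dim, context window

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (center-word) vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context-word) vectors

# (center, context) training pairs from a sliding window over the corpus.
pairs = [(word_to_idx[corpus[i]], word_to_idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if i != j]

lr = 0.05
for _ in range(200):
    for c, o in pairs:
        v = W_in[c]
        scores = W_out @ v                    # score for every vocab word
        exp = np.exp(scores - scores.max())
        probs = exp / exp.sum()               # softmax over the vocabulary
        grad = probs.copy()
        grad[o] -= 1.0                        # gradient of cross-entropy loss
        W_in[c] -= lr * (W_out.T @ grad)      # update center-word vector
        W_out -= lr * np.outer(grad, v)       # update all context vectors

def most_similar(word, k=3):
    """Nearest neighbours of `word` by cosine similarity of input vectors."""
    v = W_in[word_to_idx[word]]
    sims = W_in @ v / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v))
    return [vocab[i] for i in np.argsort(-sims) if vocab[i] != word][:k]
```

After training, `W_in[word_to_idx[w]]` is the embedding of word `w`, and `most_similar` queries the learned space; on a corpus this small the neighbours are not meaningful, but the mechanics are the same at scale.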
