Give me a one-sentence summary of Word2Vec.
Word2Vec is a shallow neural network algorithm that learns dense vector representations of words by predicting a word from its surrounding context (or the context from the word) in a given corpus.
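A minimal sketch of how such embeddings are trained in practice, using the gensim library's Word2Vec implementation; the toy corpus and parameter values here are illustrative assumptions only.

```python
# Minimal Word2Vec sketch using gensim (toy corpus and parameters are illustrative).
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; a real corpus would be far larger.
corpus = [
    ["the", "bank", "raised", "interest", "rates"],
    ["the", "central", "bank", "cut", "rates"],
    ["investors", "watch", "interest", "rate", "moves"],
]

# sg=1 selects the skip-gram objective (predict context words from the center word);
# sg=0 would use CBOW (predict the center word from its context).
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Inspect the learned embedding and nearest neighbours for a word.
print(model.wv["bank"].shape)              # (50,)
print(model.wv.most_similar("bank", topn=3))
```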
Give me a one-sentence summary of BERT.
BERT is a transformer-based deep learning model that uses bidirectional training to pretrain contextualized word representations, achieving state-of-the-art performance on a wide range of natural language processing tasks.
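To make "contextualized representations" concrete, here is a hedged sketch of extracting BERT embeddings with the Hugging Face transformers library and the public bert-base-uncased checkpoint; the example sentences are assumptions for illustration.

```python
# Sketch: extracting contextual BERT embeddings with Hugging Face transformers.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# The same word ("bank") receives different vectors depending on its context.
sentences = ["She sat by the river bank.", "He deposited cash at the bank."]
inputs = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size=768).
print(outputs.last_hidden_state.shape)
```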
Do you know FinBERT?
Yes, FinBERT is a version of BERT that has been further pretrained on financial domain-specific data, such as news articles, financial reports, and other financial documents, to improve its performance on financial NLP tasks.
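As an illustration of the kind of financial NLP task FinBERT is typically applied to, here is a sketch of sentiment classification with a FinBERT checkpoint from the Hugging Face Hub; the specific model ID (ProsusAI/finbert) and the example sentence are assumptions, and several FinBERT variants exist.

```python
# Sketch: financial sentiment classification with a FinBERT checkpoint.
# The model ID below is an assumption; other FinBERT variants are available on the Hub.
from transformers import pipeline

classifier = pipeline("text-classification", model="ProsusAI/finbert")

result = classifier("The company's quarterly earnings beat analyst expectations.")
print(result)  # e.g. [{'label': 'positive', 'score': ...}]
```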