BERT is one of the most popular models in NLP, known for producing state-of-the-art results across a wide variety of language tasks.
Built on the encoder half of the Transformer architecture, Bidirectional Encoder Representations from Transformers (BERT) is a powerful NLP modeling technique that sits at the cutting edge.
Here’s a great write-up on how to build a BERT classifier model in TF 2.0.
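Before any classifier sees a sentence, BERT expects its inputs in a specific shape: a `[CLS]` token up front (whose final hidden state feeds the classification head), a `[SEP]` token at the end, and padding plus an attention mask so examples can be batched. The sketch below illustrates that input format in plain Python with a toy vocabulary standing in for a real WordPiece tokenizer; the function name and vocabulary are illustrative, not part of any library.

```python
# Toy stand-in for a WordPiece vocabulary; real BERT vocabularies
# have ~30k entries, with 101/102 as the actual [CLS]/[SEP] ids.
TOY_VOCAB = {"[PAD]": 0, "[CLS]": 101, "[SEP]": 102,
             "the": 5, "movie": 6, "was": 7, "great": 8}

def encode_for_bert(tokens, max_len=10, vocab=TOY_VOCAB):
    """Wrap tokens in [CLS]/[SEP], map them to ids, pad to max_len,
    and build the attention mask a BERT classifier consumes."""
    wrapped = ["[CLS]"] + tokens[: max_len - 2] + ["[SEP]"]
    input_ids = [vocab[t] for t in wrapped]
    attention_mask = [1] * len(input_ids)  # 1 = real token, 0 = padding
    pad = max_len - len(input_ids)
    input_ids += [vocab["[PAD]"]] * pad
    attention_mask += [0] * pad
    return input_ids, attention_mask

ids, mask = encode_for_bert(["the", "movie", "was", "great"])
```

A real TF 2.0 pipeline would hand these `input_ids` and `attention_mask` tensors to a pretrained BERT encoder and train a small dense layer on top of the `[CLS]` output.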
The success of BERT has not only made it the engine behind Google Search but has also inspired and paved the way for many newer, better models. Given below are some popular NLP models and algorithms that were inspired by BERT: