Getting Started with Google BERT
Build and train state-of-the-art natural language processing models using BERT

By Sudharsan Ravichandiran. Published by Packt, January 2021. 1st edition, 352 pages. ISBN-13: 9781838821593.

Table of Contents

Preface
Section 1 - Starting Off with BERT
  1. A Primer on Transformers
  2. Understanding the BERT Model
  3. Getting Hands-On with BERT
Section 2 - Exploring BERT Variants
  4. BERT Variants I - ALBERT, RoBERTa, ELECTRA, and SpanBERT
  5. BERT Variants II - Based on Knowledge Distillation
Section 3 - Applications of BERT
  6. Exploring BERTSUM for Text Summarization
  7. Applying BERT to Other Languages
  8. Exploring Sentence and Domain-Specific BERT
  9. Working with VideoBERT, BART, and More
Assessments
Other Books You May Enjoy

Language-specific BERT

In the previous sections, we learned how M-BERT works and how it can be applied across many different languages. However, instead of relying on a single M-BERT model for all languages, can we train a monolingual BERT for a specific target language? We can, and that is precisely what we will learn in this section. We will look at several interesting and popular monolingual BERT models for various languages, as listed here (a short loading sketch follows the list):

  • FlauBERT for French
  • BETO for Spanish
  • BERTje for Dutch
  • German BERT
  • Chinese BERT
  • Japanese BERT
  • FinBERT for Finnish
  • UmBERTo for Italian
  • BERTimbau for Portuguese
  • RuBERT for Russian
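
All of these models are published as checkpoints on the Hugging Face model hub and load through the same transformers API, which makes each of them a drop-in, single-language alternative to M-BERT. The following is a minimal sketch; the hub identifiers used here (for example, dccuchile/bert-base-spanish-wwm-uncased for BETO) are assumptions based on the names these teams commonly publish under, so verify them on the hub before running:

    from transformers import AutoModel, AutoTokenizer

    # NOTE: these hub identifiers are assumptions; check the Hugging Face
    # model hub for the current names before running.
    checkpoints = {
        "Spanish (BETO)": "dccuchile/bert-base-spanish-wwm-uncased",
        "Dutch (BERTje)": "GroNLP/bert-base-dutch-cased",
        "German BERT": "bert-base-german-cased",
    }

    for language, name in checkpoints.items():
        # Each monolingual model uses the standard BERT architecture and API,
        # but with a vocabulary built for its target language.
        tokenizer = AutoTokenizer.from_pretrained(name)
        model = AutoModel.from_pretrained(name)
        print(language, "- vocabulary size:", tokenizer.vocab_size)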

FlauBERT for French

FlauBERT, which stands for French Language Understanding via BERT, is a pre-trained BERT model for the French language. The FlauBERT model performs better than the multilingual and cross-lingual models on many downstream French NLP tasks.
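
As a quick sketch of how we might obtain contextual representations of a French sentence with FlauBERT (the checkpoint name flaubert/flaubert_base_cased is an assumption; verify it on the Hugging Face model hub):

    import torch
    from transformers import AutoModel, AutoTokenizer

    # NOTE: the checkpoint name below is an assumption; confirm it on the hub.
    model_name = "flaubert/flaubert_base_cased"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)

    sentence = "Paris est ma ville préférée"
    inputs = tokenizer(sentence, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)

    # One contextual embedding per token: (batch, num_tokens, hidden_size)
    print(outputs.last_hidden_state.shape)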

FlauBERT is trained on a huge, heterogeneous French corpus made up of 24 sub-corpora containing...
