sudharsan13296/Getting-Started-with-Google-BERT: Build and train state-of-the-art natural language processing models using BERT


Open source project name:

sudharsan13296/Getting-Started-with-Google-BERT

Open source project URL:

https://github.com/sudharsan13296/Getting-Started-with-Google-BERT

Open source language:

Jupyter Notebook 100.0%

Open source introduction:

Getting started with Google BERT

Build and train state-of-the-art natural language processing models using BERT

About the book

Book Cover

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. With a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work.

You'll explore the BERT architecture by learning how the BERT model is pre-trained and how to fine-tune pre-trained BERT for downstream tasks such as sentiment analysis and text summarization, using the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster variants of BERT based on knowledge distillation, such as DistilBERT and TinyBERT.
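As a taste of the fine-tuning chapters, here is a minimal sketch of sentiment analysis with a pre-trained model through the Hugging Face transformers pipeline; the checkpoint name is an assumption chosen for illustration, not the book's own code:

```python
# Minimal sentiment-analysis sketch using the Hugging Face pipeline API.
# The checkpoint below is an assumption; any BERT-family model
# fine-tuned for sentiment classification works the same way.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("I love Paris"))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]
```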

The book takes you through M-BERT, XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and explore an interesting variant called VideoBERT.
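For a flavor of Sentence-BERT, the sketch below obtains sentence embeddings with the sentence-transformers package; the package and checkpoint name are assumptions for illustration:

```python
# Minimal Sentence-BERT sketch using the sentence-transformers package
# (pip install sentence-transformers). The checkpoint is an assumption;
# any SBERT model returns a fixed-size embedding per sentence.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(["Paris is beautiful", "I love Paris"])
print(embeddings.shape)  # e.g. (2, 384) for this checkpoint
```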

Get the book


Clone the repo and run in Google Colab

1. A Primer on Transformers

2. Understanding the BERT model

  • 2.1. Basic idea of BERT
  • 2.2. Working of BERT
  • 2.3. Configuration of BERT
  • 2.4. Pre-training BERT
  • 2.5. Pre-training strategies
  • 2.6. Pre-training procedure
  • 2.7. Subword tokenization algorithms
  • 2.8. Byte pair encoding
  • 2.9. Byte-level byte pair encoding
  • 2.10. WordPiece (see the tokenization sketch after this list)
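As a quick illustration of Section 2.10, this minimal sketch runs WordPiece tokenization with the pre-trained BERT tokenizer from Hugging Face; continuation subwords carry the '##' prefix:

```python
# Minimal WordPiece sketch: BERT's tokenizer splits out-of-vocabulary
# words into subwords, marking continuation pieces with '##'.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
print(tokenizer.tokenize("Let us start pretraining the model"))
# e.g. ['let', 'us', 'start', 'pre', '##train', '##ing', 'the', 'model']
# (the exact split depends on the vocabulary)
```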

3. Getting hands-on with BERT

4. BERT variants I - ALBERT, RoBERTa, ELECTRA, SpanBERT

5. BERT variants II - Based on knowledge distillation

  • 5.1. Knowledge distillation (a minimal loss sketch follows this list)
  • 5.2. DistilBERT - distilled version of BERT
  • 5.3. Training DistilBERT
  • 5.4. TinyBERT
  • 5.5. Teacher-student architecture
  • 5.6. Training TinyBERT
  • 5.7. Transferring knowledge from BERT to a neural network
  • 5.8. Teacher-student architecture
  • 5.9. Training the student network
  • 5.10. Data augmentation method
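To make Section 5.1 concrete, here is a minimal PyTorch sketch of the soft-target distillation loss, assuming teacher and student logits are already computed; it illustrates the general technique (Hinton et al., 2015), not the book's exact training code:

```python
# Soft-target distillation loss: the student learns to match the
# teacher's temperature-softened output distribution.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Temperature T > 1 softens the teacher's distribution, exposing
    # its "dark knowledge" about non-target classes.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_probs = F.log_softmax(student_logits / T, dim=-1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures.
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * T * T

# Dummy logits for a batch of 4 examples with 2 classes.
loss = distillation_loss(torch.randn(4, 2), torch.randn(4, 2))
print(loss)
```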

6. Exploring BERTSUM for text summarization

  • 6.1. Text summarization
  • 6.2. Fine-tuning BERT for text summarization
  • 6.3. Extractive summarization using BERT
  • 6.4. Abstractive summarization using BERT
  • 6.5. Understanding the ROUGE evaluation metric (see the sketch after this list)
  • 6.6. Performance of BERTSUM model
  • 6.7. Training the BERTSUM model
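For Section 6.5, the sketch below computes ROUGE scores for a candidate summary against a reference; the third-party rouge-score package is an assumption about tooling, not the book's own code:

```python
# Minimal ROUGE sketch (pip install rouge-score): measures n-gram
# overlap between a reference summary and a candidate summary.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
reference = "the cat was found under the bed"
candidate = "the cat was under the bed"
print(scorer.score(reference, candidate))  # precision/recall/F1 per metric
```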

7. Applying BERT for other languages

8. Exploring Sentence and Domain-Specific BERT

9. Understanding VideoBERT, BART, and more



