Project name: robrua/easy-bert

Project URL: https://github.com/robrua/easy-bert

Primary language: Java (62.2%)

License: MIT

easy-bert

easy-bert is a dead simple API for using Google's high quality BERT language model in Python and Java.

Currently, easy-bert is focused on getting embeddings from pre-trained BERT models in both Python and Java. Support for fine-tuning and pre-training in Python will be added in the future, as well as support for using easy-bert for other tasks besides getting embeddings.

Python

How To Get It

easy-bert is available on PyPI. You can install with pip install easybert or pip install git+https://github.com/robrua/easy-bert.git if you want the very latest.

Usage

You can use easy-bert with pre-trained BERT models from TensorFlow Hub or from local models in the TensorFlow saved model format.

To create a BERT embedder from a TensorFlow Hub model, simply instantiate a Bert object with the target tf-hub URL:

from easybert import Bert
bert = Bert("https://tfhub.dev/google/bert_multi_cased_L-12_H-768_A-12/1")

You can also load a local model in TensorFlow's saved model format using Bert.load:

from easybert import Bert
bert = Bert.load("/path/to/your/model/")

Once you have a BERT model loaded, you can get sequence embeddings using bert.embed:

x = bert.embed("A sequence")
y = bert.embed(["Multiple", "Sequences"])

If you want per-token embeddings, you can set per_token=True:

x = bert.embed("A sequence", per_token=True)
y = bert.embed(["Multiple", "Sequences"], per_token=True)

easy-bert returns BERT embeddings as numpy arrays.
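As a rough sketch of the array shapes to expect (these are illustrative numpy stand-ins, not real bert.embed calls, assuming a BERT-Base model with hidden size 768 like those listed further below):

```python
import numpy as np

# Illustrative stand-ins for easy-bert output shapes, assuming a
# BERT-Base model (hidden size 768). Real values come from bert.embed.
sequence_embedding = np.zeros(768, dtype=np.float32)     # bert.embed("A sequence")
batch_embeddings = np.zeros((2, 768), dtype=np.float32)  # bert.embed(["Multiple", "Sequences"])
per_token = np.zeros((4, 768), dtype=np.float32)         # per_token=True: one row per wordpiece token

print(sequence_embedding.shape)  # (768,)
print(batch_embeddings.shape)    # (2, 768)
```

With per_token=True the token dimension depends on how the tokenizer splits your input, so its size varies per sequence.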

Every time you call bert.embed, a new TensorFlow session is created and used for the computation. If you're calling bert.embed a lot sequentially, you can speed up your code by sharing a TensorFlow session among those calls using a with statement:

with bert:
    x = bert.embed("A sequence", per_token=True)
    y = bert.embed(["Multiple", "Sequences"], per_token=True)

You can save a BERT model using bert.save, then reload it later using Bert.load:

bert.save("/path/to/your/model/")
bert = Bert.load("/path/to/your/model/")

CLI

easy-bert also provides a CLI tool to conveniently do one-off embeddings of sequences with BERT. It can also convert a TensorFlow Hub model to a saved model.

Run bert --help, bert embed --help or bert download --help to get details about the CLI tool.

Docker

easy-bert comes with a Docker build that can be used as a base image for applications that rely on BERT embeddings, or to just run the CLI tool without needing to install an environment.

Java

How To Get It

easy-bert is available on Maven Central. It is also distributed through the releases page.

To add the latest easy-bert release version to your maven project, add the dependency to your pom.xml dependencies section:

<dependencies>
  <dependency>
    <groupId>com.robrua.nlp</groupId>
    <artifactId>easy-bert</artifactId>
    <version>1.0.3</version>
  </dependency>
</dependencies>

Or, if you want to get the latest development version, add the Sonatype Snapshot Repository to your pom.xml as well:

<dependencies>
  <dependency>
    <groupId>com.robrua.nlp</groupId>
    <artifactId>easy-bert</artifactId>
    <version>1.0.4-SNAPSHOT</version>
  </dependency>
</dependencies>

<repositories>
  <repository>
    <id>snapshots-repo</id>
    <url>https://oss.sonatype.org/content/repositories/snapshots</url>
    <releases>
      <enabled>false</enabled>
    </releases>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>

Usage

You can use easy-bert with pre-trained BERT models generated with easy-bert's Python tools. You can also use pre-generated models from Maven Central.

To load a model from your local filesystem, you can use:

try(Bert bert = Bert.load(new File("/path/to/your/model/"))) {
    // Embed some sequences
}

If the model is in your classpath (e.g. if you're pulling it in via Maven), you can use:

try(Bert bert = Bert.load("/resource/path/to/your/model")) {
    // Embed some sequences
}

Once you have a BERT model loaded, you can get sequence embeddings using bert.embedSequence or bert.embedSequences:

float[] embedding = bert.embedSequence("A sequence");
float[][] embeddings = bert.embedSequences("Multiple", "Sequences");

If you want per-token embeddings, you can use bert.embedTokens:

float[][] embedding = bert.embedTokens("A sequence");
float[][][] embeddings = bert.embedTokens("Multiple", "Sequences");
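A common use of the sequence embeddings above is comparing sequences by cosine similarity. The helper below is illustrative, not part of the easy-bert API; it operates on plain float[] vectors like those returned by bert.embedSequence:

```java
public class CosineSimilarity {
    // Cosine similarity between two embedding vectors, e.g. the float[]
    // arrays returned by bert.embedSequence. Returns a value in [-1, 1].
    public static double cosine(float[] a, float[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        float[] x = {1f, 0f, 0f};
        float[] y = {0f, 1f, 0f};
        System.out.println(cosine(x, x)); // identical vectors -> 1.0
        System.out.println(cosine(x, y)); // orthogonal vectors -> 0.0
    }
}
```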

Pre-Generated Maven Central Models

Various TensorFlow Hub BERT models are available in easy-bert format on Maven Central. To use one in your project, add the following to your pom.xml, substituting one of the Artifact IDs listed below in place of ARTIFACT-ID in the artifactId:

<dependencies>
  <dependency>
    <groupId>com.robrua.nlp.models</groupId>
    <artifactId>ARTIFACT-ID</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>

Once you've pulled in the dependency, you can load the model using this code. Substitute the appropriate Resource Path from the list below in place of RESOURCE-PATH based on the model you added as a dependency:

try(Bert bert = Bert.load("RESOURCE-PATH")) {
    // Embed some sequences
}

Available Models

| Model | Languages | Layers | Embedding Size | Heads | Parameters | Artifact ID | Resource Path |
|---|---|---|---|---|---|---|---|
| BERT-Base, Uncased | English | 12 | 768 | 12 | 110M | easy-bert-uncased-L-12-H-768-A-12 | com/robrua/nlp/easy-bert/bert-uncased-L-12-H-768-A-12 |
| BERT-Base, Cased | English | 12 | 768 | 12 | 110M | easy-bert-cased-L-12-H-768-A-12 | com/robrua/nlp/easy-bert/bert-cased-L-12-H-768-A-12 |
| BERT-Base, Multilingual Cased | 104 languages | 12 | 768 | 12 | 110M | easy-bert-multi-cased-L-12-H-768-A-12 | com/robrua/nlp/easy-bert/bert-multi-cased-L-12-H-768-A-12 |
| BERT-Base, Chinese | Chinese (Simplified and Traditional) | 12 | 768 | 12 | 110M | easy-bert-chinese-L-12-H-768-A-12 | com/robrua/nlp/easy-bert/bert-chinese-L-12-H-768-A-12 |
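For example, to pull in the multilingual model listed above, the ARTIFACT-ID placeholder resolves to:

```xml
<dependencies>
  <dependency>
    <groupId>com.robrua.nlp.models</groupId>
    <artifactId>easy-bert-multi-cased-L-12-H-768-A-12</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>
```

It can then be loaded with Bert.load("com/robrua/nlp/easy-bert/bert-multi-cased-L-12-H-768-A-12"), the corresponding Resource Path from the table.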

Creating Your Own Models

For now, easy-bert can only use pre-trained TensorFlow Hub BERT models that have been converted using the Python tools. We will be adding support for fine-tuning and pre-training new models easily, but there are no plans to support these on the Java side. You'll need to train in Python, save the model, then load it in Java.

Bugs

If you find bugs, please let us know via a pull request or issue.

Citing easy-bert

If you used easy-bert for your research, please cite the project.



