Models combine tf.keras layers and models into trainable networks.
Several pre-built, canned models are provided for training encoder networks. These models are intended both as convenience functions and as canonical examples.
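A canned model is typically built by constructing an encoder network and passing it to the model class. The following is a minimal sketch assuming the tf-models-official package (imported as `tensorflow_models`); the encoder class `tfm.nlp.networks.BertEncoder` and the exact constructor arguments are assumptions that may differ between releases.

```python
# Minimal sketch: wrap an encoder network in a canned classification model.
# Assumes the tf-models-official package; argument names may vary by version.
import tensorflow_models as tfm

# A small BERT-style encoder network (sizes chosen only for illustration).
encoder = tfm.nlp.networks.BertEncoder(
    vocab_size=30522,
    hidden_size=128,
    num_layers=2,
    num_attention_heads=2,
)

# Canned classifier head on top of the encoder; trainable like any Keras model.
classifier = tfm.nlp.models.BertClassifier(
    network=encoder,
    num_classes=3,
)
```

The resulting classifier is a standard Keras model and can be compiled and trained with the usual `fit`/`compile` workflow or with the library's training loops.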
Classes
class BertClassifier: Classifier model based on a BERT-style transformer-based encoder.
class BertPretrainer: BERT pretraining model.
class BertPretrainerV2: BERT pretraining model V2.
class BertSpanLabeler: Span labeler model based on a BERT-style transformer-based encoder.
class BertTokenClassifier: Token classifier model based on a BERT-style transformer-based encoder.
class DualEncoder: A dual encoder model based on a transformer-based encoder.
class ElectraPretrainer: ELECTRA network training model.
class Seq2SeqTransformer: Transformer model with Keras.
class T5Transformer: Transformer Encoder+Decoder for sequence-to-sequence tasks.
class T5TransformerParams: Transformer parameters.
class TransformerDecoder: Transformer decoder.
class TransformerEncoder: Transformer encoder.
class XLNetClassifier: Classifier model based on XLNet.
class XLNetPretrainer: XLNet-based pretrainer.
class XLNetSpanLabeler: Span labeler model based on XLNet.
Functions
attention_initializer(...): Initializer for attention layers in Seq2SeqTransformer.
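As a rough usage sketch (the argument name is assumed from the docstring; verify against your installed version), the returned initializer can be passed wherever a Keras kernel initializer is accepted:

```python
import tensorflow as tf
import tensorflow_models as tfm

# attention_initializer returns a Keras initializer scaled to the hidden size
# (hidden size of 512 is assumed here purely for illustration).
init = tfm.nlp.models.attention_initializer(512)

# Supply it as the kernel initializer of a layer, e.g. a Dense projection.
dense = tf.keras.layers.Dense(512, kernel_initializer=init)
```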
Other Members

| Member | Value |
|---|---|
| EOS_ID | 1 |