Keras library for building (Universal) Transformers, facilitating BERT and GPT models
Experimental module for AST transformations
Transformer seq2seq model, a program that builds a language translator from a parallel corpus
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
TensorFlow code and pre-trained models for BERT
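The repositories above all implement variants of the architecture from "Attention Is All You Need", whose core operation is scaled dot-product attention. As a minimal, dependency-light sketch (NumPy only, not taken from any of the listed codebases), the operation can be written as:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4)
```

Each output row is a convex combination of the value rows, weighted by query-key similarity; the listed libraries wrap this same computation in multi-head, batched, and masked forms.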