Keras library for building (Universal) Transformers, facilitating BERT and GPT models
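A rough sketch of what assembling a Transformer encoder block in Keras looks like, using only stock `tf.keras` layers (this is a generic illustration, not necessarily this library's own API; the layer sizes are arbitrary assumptions):

```python
import tensorflow as tf

# Illustrative Keras Transformer encoder block (generic sketch, not a specific library's API).
def encoder_block(embed_dim=64, num_heads=4, ff_dim=128, seq_len=32):
    inputs = tf.keras.Input(shape=(seq_len, embed_dim))
    # Multi-head self-attention: queries, keys and values all come from the same input.
    attn = tf.keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)(inputs, inputs)
    x = tf.keras.layers.LayerNormalization()(inputs + attn)   # residual connection + layer norm
    # Position-wise feed-forward network.
    ff = tf.keras.layers.Dense(ff_dim, activation="relu")(x)
    ff = tf.keras.layers.Dense(embed_dim)(ff)
    outputs = tf.keras.layers.LayerNormalization()(x + ff)    # residual connection + layer norm
    return tf.keras.Model(inputs, outputs)

block = encoder_block()
block.summary()
```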
Transformer seq2seq model: a program that can build a language translator from a parallel corpus
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
A TensorFlow implementation of the Transformer from "Attention Is All You Need"
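The core operation that title refers to is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal TensorFlow sketch of that formula, independent of this particular implementation:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (no masking, for brevity)."""
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)  # (batch, len_q, len_k)
    weights = tf.nn.softmax(scores, axis=-1)                   # attention weights
    return tf.matmul(weights, v)                               # (batch, len_q, d_v)

# Toy example: batch of 2, sequence length 5, model dimension 8.
q = tf.random.normal((2, 5, 8))
k = tf.random.normal((2, 5, 8))
v = tf.random.normal((2, 5, 8))
print(scaled_dot_product_attention(q, k, v).shape)  # (2, 5, 8)
```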
Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
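A minimal usage sketch of the library's high-level `pipeline` API (requires `transformers` plus a backend such as PyTorch; the default model is downloaded on first use, and the exact output scores will vary):

```python
# Quick sketch of the Transformers pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Transformers make state-of-the-art NLP easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```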