Keras library for building (Universal) Transformers, facilitating BERT and GPT models
Experimental module for AST transformations
Transformer seq2seq model: a program that builds a language translator from a parallel corpus
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
TensorFlow code and pre-trained models for BERT
PyTorch implementation of Transformers, explained with comments
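All of the repositories above implement some variant of the mechanism from "Attention Is All You Need". As a minimal sketch of what they share, here is scaled dot-product attention in plain NumPy (the function name and shapes are illustrative, not taken from any of the listed repos):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)            # (n_queries, n_keys)
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                          # (n_queries, d_v)

# Toy shapes: 2 queries, 3 keys/values, d_k=4, d_v=8
rng = np.random.default_rng(0)
q = rng.random((2, 4))
k = rng.random((3, 4))
v = rng.random((3, 8))
out = scaled_dot_product_attention(q, k, v)
```

Each output row is a convex combination of the value vectors, weighted by query-key similarity; the full models in these repos stack this inside multi-head attention with learned projections.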