Official Documentation.
Version: 1.2.1
A TensorFlow-compatible Python library providing models and layers for building custom Transformer neural networks. Built on TensorFlow 2.
Structure of the library:
layers
- class SelfAttention. Self-Attention implementation.
- class MultiHeadSelfAttention. Multi-Head Self-Attention implementation.
- class PositionalEmbedding. Embeddings of tokens and positions.
- class ImageEmbedding. Embeddings of image patches, to build Vision Transformer models.
- class TransformerLayer. Transformer Encoder layer.
- class GPTLayer. GPT Layer.
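The attention layers above are all built around the same core operation: scaled dot-product attention. The snippet below is an illustrative NumPy sketch of that computation, not the library's actual API (function and variable names here are the author's own for illustration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single sequence."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # weighted sum of value vectors

# Toy example: 4 tokens, model dimension 8 (hypothetical shapes)
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)  # (4, 8)
```

Multi-head attention applies this same operation in parallel over several lower-dimensional projections of the input and concatenates the results.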
models
(coming soon)
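The PositionalEmbedding layer listed above injects position information alongside token embeddings. Whether the library uses learned or fixed encodings is not stated here; as background, the classic sinusoidal encoding from the original Transformer paper can be sketched as follows (function name is illustrative):

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """PE[pos, 2i] = sin(pos / 10000^(2i/d)); PE[pos, 2i+1] = cos(...)."""
    pos = np.arange(seq_len)[:, None]                # (seq_len, 1) positions
    i = np.arange(d_model)[None, :]                  # (1, d_model) dimension indices
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

pe = sinusoidal_positions(50, 16)
print(pe.shape)  # (50, 16)
```

These encodings are simply added to the token embeddings before the first attention layer.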
Tutorials
- A Transformer Neural Network for Sentiment Analysis. (Google Colab)
- Building a Vision Transformer for Image Classification. (Google Colab)
- Neural Text Generation with a Custom GPT. (Google Colab)
- Save and load models.
Author
Ivan Bongiorni
Data Scientist, Associate Director
UBS, Emerging Solutions
Zurich, Switzerland.
ivanbongiorni@protonmail.com