
maximal

A Python library, built on TensorFlow 2, that provides models and layers for implementing custom Transformer neural networks.

PositionalEmbedding()

Two tensorflow.keras.layers.Embedding() layers that learn token and position representations, respectively. This differs from the fixed sinusoidal positional encoding of the original Transformer formulation and instead follows the learned positional embeddings used in current state-of-the-art Transformers.

Inherits from tensorflow.keras.layers.Layer.
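
As a point of reference, below is a minimal sketch of the learned token-plus-position embedding pattern this layer follows, written with two tensorflow.keras.layers.Embedding() layers whose outputs are combined. The class and argument names (LearnedPositionalEmbedding, maxlen, vocab_size, embed_dim) are illustrative assumptions, not maximal's actual signature; see the argument lists below for the library's API.

```python
import tensorflow as tf


class LearnedPositionalEmbedding(tf.keras.layers.Layer):
    """Sketch of a learned token + position embedding layer (illustrative, not maximal's implementation)."""

    def __init__(self, maxlen, vocab_size, embed_dim, **kwargs):
        super().__init__(**kwargs)
        # Embedding for token ids: vocab_size -> embed_dim
        self.token_emb = tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=embed_dim)
        # Embedding for positions 0..maxlen-1: maxlen -> embed_dim
        self.pos_emb = tf.keras.layers.Embedding(input_dim=maxlen, output_dim=embed_dim)

    def call(self, x):
        # x: integer token ids of shape (batch, seq_len)
        seq_len = tf.shape(x)[-1]
        positions = tf.range(start=0, limit=seq_len, delta=1)
        # Sum token and position representations (broadcast over the batch dimension)
        return self.token_emb(x) + self.pos_emb(positions)


# Usage sketch:
# ids = tf.constant([[5, 17, 3, 0]])                                        # (batch=1, seq_len=4)
# layer = LearnedPositionalEmbedding(maxlen=128, vocab_size=10000, embed_dim=64)
# out = layer(ids)                                                          # shape (1, 4, 64)
```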

Arguments

__init__ arguments:

call arguments:

Returns

Used in tutorial