
Transformers from Scratch

This project contains from-scratch implementations of a Transformer block, single-head attention, multi-head attention, and a causal mask.

Model Details

Model Description

To solidify knowledge and serve as a reference, the attention block follows the paper "Attention Is All You Need" (Vaswani et al., 2017).
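The core operation defined in that paper is scaled dot-product attention:

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
```

where d_k is the dimensionality of the keys; multi-head attention runs several of these in parallel over learned projections of Q, K, and V and concatenates the results.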


  • Developed by: Michael Peres
  • Model type: Vanilla Transformer from Scratch
  • Language(s) (NLP): English
  • License: MIT

Model Sources

Uses

[More Information Needed]

How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
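Pending a documented entry point, a rough usage sketch might look like the following; the module path `model` and the class name `TransformerBlock` (including its constructor arguments) are assumptions, not the repository's confirmed API:

```python
import torch

# Hypothetical import: the actual module path and class name in this
# repository are undocumented here and may well differ.
from model import TransformerBlock

block = TransformerBlock(embed_dim=512, num_heads=8)  # assumed constructor
x = torch.randn(1, 16, 512)   # (batch, sequence length, embedding dim)
out = block(x)                # a Transformer block preserves the input shape
print(out.shape)              # torch.Size([1, 16, 512])
```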

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: RTX 3070Ti
  • Hours used: 0.1 hours

Model Architecture and Objective

The objective of this project was to understand Transformers and the basic self-attention mechanism by implementing its core components from scratch: self-attention, multi-head attention, the causal mask, and the Transformer block.
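As a minimal sketch of how these components fit together (this is one standard PyTorch formulation, not the repository's actual code), causal multi-head self-attention can be written as:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Minimal multi-head self-attention with a causal mask.

    A sketch of the components named above; the repository's actual
    implementation may be organised differently.
    """

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch, seq_len, embed_dim = x.shape
        # Project to queries, keys, values and split into heads:
        # each becomes (batch, heads, seq_len, head_dim).
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        shape = (batch, seq_len, self.num_heads, self.head_dim)
        q, k, v = (t.reshape(shape).transpose(1, 2) for t in (q, k, v))

        # Scaled dot-product scores: Q K^T / sqrt(d_k).
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)

        # Causal mask: position i may only attend to positions <= i.
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool,
                                     device=x.device), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))

        out = F.softmax(scores, dim=-1) @ v
        # Merge heads back to (batch, seq_len, embed_dim).
        out = out.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.proj(out)
```

A Transformer block would then wrap a module like this with residual connections, layer normalisation, and a position-wise feed-forward network, as described in the paper.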

Model Card Contact
