Keras Transformer Model
This repository contains a from-scratch Keras/TensorFlow implementation of a Transformer model, aimed at building a language model (LLM) from the ground up. The model can be trained on various text data for tasks such as language translation and text generation. The architecture is based on the Transformer introduced in the paper "Attention Is All You Need" by Vaswani et al.
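
For orientation, a single encoder block of that architecture can be sketched with standard Keras layers. This is a minimal illustration, not the repository's actual code; the class and argument names here (TransformerBlock, embed_dim, num_heads, ff_dim) are chosen for clarity and are not assumed to match the repo's API.

```python
import tensorflow as tf
from tensorflow.keras import layers

class TransformerBlock(layers.Layer):
    """One encoder block: multi-head self-attention + position-wise feed-forward,
    each wrapped with a residual connection and layer normalization."""

    def __init__(self, embed_dim, num_heads, ff_dim, dropout_rate=0.1):
        super().__init__()
        self.attention = layers.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)
        self.ffn = tf.keras.Sequential([
            layers.Dense(ff_dim, activation="relu"),
            layers.Dense(embed_dim),
        ])
        self.norm1 = layers.LayerNormalization(epsilon=1e-6)
        self.norm2 = layers.LayerNormalization(epsilon=1e-6)
        self.drop1 = layers.Dropout(dropout_rate)
        self.drop2 = layers.Dropout(dropout_rate)

    def call(self, inputs, training=False):
        # Self-attention sub-layer with residual connection
        attn_out = self.attention(inputs, inputs)
        attn_out = self.drop1(attn_out, training=training)
        out1 = self.norm1(inputs + attn_out)
        # Feed-forward sub-layer with residual connection
        ffn_out = self.ffn(out1)
        ffn_out = self.drop2(ffn_out, training=training)
        return self.norm2(out1 + ffn_out)
```

A full encoder stacks several such blocks on top of token and position embeddings; the decoder described in the paper adds masked self-attention and cross-attention over the encoder output.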

Requirements
Python 3.x
TensorFlow 2.x
Keras
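
A quick way to confirm the environment matches these requirements, assuming the packages are already installed (for example via pip):

```python
# Minimal environment check; package names assume a standard pip install of TensorFlow/Keras.
import sys
import tensorflow as tf
import keras

print("Python:", sys.version.split()[0])   # expect 3.x
print("TensorFlow:", tf.__version__)       # expect 2.x
print("Keras:", keras.__version__)
```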
Installation
Clone this repository to your local machine:
  git clone https://github.com/yashpadale/Transformer_From_scratch
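
The repository's own training entry point is not reproduced here. As a rough sketch of how such a model might be assembled and trained with Keras, the snippet below wires token and position embeddings into the TransformerBlock sketched earlier and fits it on random next-token data; all hyperparameters and names are illustrative, not the repo's defaults.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, seq_len, embed_dim = 1000, 32, 64   # illustrative hyperparameters

class TokenAndPositionEmbedding(layers.Layer):
    """Sum of learned token embeddings and learned position embeddings."""
    def __init__(self, maxlen, vocab_size, embed_dim):
        super().__init__()
        self.token_emb = layers.Embedding(input_dim=vocab_size, output_dim=embed_dim)
        self.pos_emb = layers.Embedding(input_dim=maxlen, output_dim=embed_dim)

    def call(self, x):
        positions = tf.range(start=0, limit=tf.shape(x)[-1], delta=1)
        return self.token_emb(x) + self.pos_emb(positions)

inputs = layers.Input(shape=(seq_len,), dtype="int32")
x = TokenAndPositionEmbedding(seq_len, vocab_size, embed_dim)(inputs)
# Reuses the TransformerBlock class sketched above (assumed to be in scope).
x = TransformerBlock(embed_dim, num_heads=4, ff_dim=128)(x)
outputs = layers.Dense(vocab_size, activation="softmax")(x)   # per-position token probabilities

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Random token IDs stand in for real text; targets are the inputs shifted by one position.
tokens = np.random.randint(0, vocab_size, size=(256, seq_len + 1))
model.fit(tokens[:, :-1], tokens[:, 1:], epochs=1, batch_size=32)
```

A real text-generation setup would additionally apply a causal attention mask (for example, `use_causal_mask=True` on the attention call) so that positions cannot attend to future tokens; this sketch omits that for brevity.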
Acknowledgments
This implementation is inspired by the original Transformer paper by Vaswani et al. and various open-source Transformer implementations.
