A simple Next.js chat app that uses Mixtral MoE through Together.ai. It shows how to do streaming with open source LLMs using Next.js and Together.ai.
This project uses Mixtral MoE through Together.ai's serverless endpoints and Vercel Edge Functions with streaming. It takes the prompt the user specifies, sends it to Mixtral MoE via a Together serverless endpoint, then streams the response back to the application.
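The flow above (prompt in, streamed completion out) could be sketched as an Edge route like the following. This is a minimal illustration, not the app's actual code: it assumes Together's OpenAI-compatible chat completions endpoint, the model id `mistralai/Mixtral-8x7B-Instruct-v0.1`, and a `TOGETHER_API_KEY` environment variable.

```typescript
// Hypothetical app/api/chat/route.ts — a Vercel Edge route that proxies a
// streaming Mixtral completion from Together.ai back to the browser.
export const runtime = "edge";

// Build the request body for Together's chat completions endpoint.
export function buildBody(prompt: string) {
  return {
    model: "mistralai/Mixtral-8x7B-Instruct-v0.1", // Together's Mixtral MoE id
    messages: [{ role: "user", content: prompt }],
    stream: true, // ask Together to stream tokens as server-sent events
  };
}

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const upstream = await fetch("https://api.together.xyz/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
    },
    body: JSON.stringify(buildBody(prompt)),
  });

  // Pass the upstream event stream straight through to the client,
  // so tokens render as they arrive instead of after the full response.
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```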
After cloning the repo, go to Together.ai to make an account and put your API key in a file called `.env`.

Then, run the application in the command line and it will be available at http://localhost:3000:

```bash
npm run dev
```
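Assuming the code reads the key from a `TOGETHER_API_KEY` variable (the exact name depends on the app's code), the `.env` file would look like:

```
TOGETHER_API_KEY=your_together_api_key
```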
For version 2 of this app, here are some features I want to add:
- Auth with Clerk.
- Header with profile pic + name, and a link to clone the GitHub repo on the right. On the left, a select menu to pick which model to use (Mistral, Llama, etc.).
- Share and Rewrite buttons like Perplexity.
- No sidebar. Just a chatbox on the bottom similar to ChatGPT or Pi, with a disclaimer.
- Search by pressing enter in the text box.
- Migrate app to the Next.js app router.
- Make sure the app scrolls as the text comes in.
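The last item above (keeping the view pinned to the bottom as tokens stream in) could be handled with a small helper like this sketch. The function names are illustrative, not from the app; the common refinement shown is to only auto-scroll when the user is already near the bottom, so they can scroll up to read earlier messages without being yanked back down.

```typescript
// Decide whether to auto-scroll: only when the user is already near the
// bottom of the chat container (within `threshold` pixels).
export function isNearBottom(
  scrollTop: number,
  scrollHeight: number,
  clientHeight: number,
  threshold = 40
): boolean {
  return scrollHeight - scrollTop - clientHeight <= threshold;
}

// Call after each streamed chunk is appended to the chat container.
export function autoScroll(container: HTMLElement) {
  if (isNearBottom(container.scrollTop, container.scrollHeight, container.clientHeight)) {
    container.scrollTop = container.scrollHeight; // pin view to newest token
  }
}
```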
Get inspo from: