
Defaults don't seem to persist to chattr call #49

Open
sdadsetan-concert opened this issue Oct 20, 2023 · 5 comments

Comments

@sdadsetan-concert

Hello! Lovely little idea and package here. I've been trying to get it running, but I'm hitting some snags.

Let me preface this by saying that I've been able to run LlamaGPTJ-chat successfully on the machine, so I know that piece is working. I've been able to set the defaults and run chattr_test() to confirm it's using the correct path and model; however, when I run chattr from the console, it fails. Using the preview flag, I found that it still references the old default directories. Screenshots attached.

Defaults are correct:
[Screenshot 2023-10-20 at 3:32:10 PM]

Test is correct and runs, but preview flag shows a different path and therefore fails.
[Screenshot 2023-10-20 at 3:32:59 PM]

@EdwardJ1n

Hi there, I was running into a similar issue, and the following fixed it for me. Specify the type as console:
chattr_defaults(type = "console", path = "/llama/chat-ubuntu-latest-avx2", model = "/llama/ggml-gpt4all-j-v1.3-groovy.bin")

Hope it helps.
Thanks,
Edward

@sdadsetan-concert
Author

Thanks. Didn't work in my case.

@edgararuiz
Collaborator

I made some updates and I think this is fixed now; can you confirm?

> chattr_defaults()

── chattr ──────────────────────────────────────────────────────────────────────────

── Defaults for: Default ──

── Prompt:
Use the R language, the tidyverse, and tidymodels

── Model
Provider: LlamaGPT
Path/URL: Library/LlamaGPTJ-chat/build/bin/chat
Model: Library/ggml-gpt4all-j-v1.3-groovy.bin

── Model Arguments:
threads: 4
temp: 0.01
n_predict: 1000

── Context:
Max Data Files: 0
Max Data Frames: 0
Chat History
Document contents

> chattr("test", preview = TRUE)

── chattr ──────────────────────────────────────────────────────────────────────────

── Preview for: Console
Provider: LlamaGPT
Path/URL: Library/LlamaGPTJ-chat/build/bin/chat
Model: Library/ggml-gpt4all-j-v1.3-groovy.bin
threads: 4
temp: 0.01
n_predict: 1000

── Prompt:
test(Use the R language, the tidyverse, and tidymodels)

@EdwardJ1n

EdwardJ1n commented Mar 28, 2024 via email

@frankiethull

TL;DR: defaults persist now!

hi all - I was also running into this issue back in November 2023 and had pivoted to gpt4all via Python.

However, I noticed the chattr changelog (CRAN release: 2024-04-27) and figured I'd read up on the project again. Based on my tests, the edits @edgararuiz mentioned are working!

Changed defaults successfully, chattr_test() was successful, and chattr() works as intended now.
[Screenshot: chattr() output with the updated defaults]

@edgararuiz thank you!


4 participants