
Confirmation prompts should ideally come from gptscript and not the LLM when executing commands #663

Open
sangee2004 opened this issue Jul 25, 2024 · 0 comments

@sangee2004 (Contributor)

gptscript version v0.0.0-dev-671f0f02-dirty

Steps to reproduce the problem:

  1. Execute the cli-demo script: gptscript --disable-cache github.com/gptscript-ai/cli-demo
  2. Chat with the k8s agent and ask it to execute any k8s command.
  3. Notice that the confirmation prompt sometimes comes from the LLM instead of gptscript when executing commands.

Expected Behavior
Confirmation prompts should come from gptscript, not from the LLM.

Here the LLM itself prompts for confirmation:

> scale up nginx deployment to 5

  I will scale the nginx deployment to 5 replicas. Let's proceed.

  Executing the command:

    kubectl scale deployment nginx --replicas=5

  Shall I go ahead?

> no

  Alright, I won't proceed with scaling the deployment. If there's anything else you need, feel free to let me know!

The confirmation should instead have come from gptscript, as in this run:

> scale up  nginx deployment to 5

    ┌────────────────────────────────────────────────────┐
    │ Call Arguments:                                    │
    │                                                    │
    │ k8s {"task":"scale deployment nginx --replicas=5"} │
    └────────────────────────────────────────────────────┘
                                                          
    ┌─────────────────┐
    │ Call Arguments: │
    │                 │
    │ exec {"         │
    └─────────────────┘
                       
Run "kubectl scale deployment nginx --replicas=5" (or allow all "kubectl scale ..." commands)
Confirm (y/n/a)> 
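
A possible prompt-level mitigation (separate from fixing the runtime behavior) would be to tell the agent explicitly not to ask for confirmation itself, since gptscript already prompts before running exec commands. Below is a minimal sketch of such an instruction in a .gpt tool definition; the Name, Description, and wording are hypothetical and may not match how the cli-demo k8s agent is actually defined.

    Name: k8s
    Description: Agent that runs kubectl commands for the user
    Chat: true
    Tools: sys.exec

    You help the user manage their Kubernetes cluster by running kubectl
    commands through the exec tool. Do not ask the user whether to proceed
    before running a command; issue the tool call directly and let the
    gptscript runtime's confirm prompt handle approval.

Even with such an instruction the model can still occasionally ask on its own, which is why the authoritative confirmation should come from gptscript itself.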
sangee2004 added the bug label on Jul 25, 2024