

There was an error processing your request: An error occurred. #823

Open
Soumyaranjan-17 opened this issue Dec 18, 2024 · 0 comments

Describe the bug

I have Ollama installed and also have the qwen2.5-coder:14b model.
I made a file named modelFile:

FROM qwen2.5-coder:14b
PARAMETER num_ctx 32768

Q1: Why is it requesting localhost:1234/v1/models?

Ollama is running on its default port.

After starting the server and opening a chat, it shows the error:
There was an error processing your request: An error occurred.

Link to the Bolt URL that caused the error

http://localhost:3000/chat/8

Steps to reproduce

  1. Start the server
  2. Start a chat

Expected behavior

Instead, it shows an error like this:
image

Screen Recording / Screenshot

image
image
image

Here is where I found port 1234 in the source:

async function getLMStudioModels(_apiKeys?: Record<string, string>, settings?: IProviderSetting): Promise<ModelInfo[]> {
  try {
    // Falls back to LM Studio's default port (1234) when neither the
    // provider settings nor LMSTUDIO_API_BASE_URL is set.
    const baseUrl = settings?.baseUrl || import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
    const response = await fetch(`${baseUrl}/v1/models`);
    const data = (await response.json()) as any;

    // Map the OpenAI-style model list into the app's ModelInfo shape.
    return data.data.map((model: any) => ({
      name: model.id,
      label: model.id,
      provider: 'LMStudio',
    }));
  } catch (e: any) {
    logStore.logError('Failed to get LMStudio models', e, { baseUrl: settings?.baseUrl });
    return [];
  }
}
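For context, port 1234 is LM Studio's default, not Ollama's (Ollama listens on 11434 by default). A minimal sketch of the same fallback chain, using a hypothetical helper name, shows why the request always lands on port 1234 when neither the provider settings nor the `LMSTUDIO_API_BASE_URL` env variable is set:

```typescript
// Sketch (hypothetical helper, not bolt.diy's API): mirrors the precedence
// in getLMStudioModels — explicit settings win, then the env variable,
// then LM Studio's default port 1234.
function resolveLMStudioBaseUrl(
  settingsBaseUrl?: string,
  envBaseUrl?: string,
): string {
  return settingsBaseUrl || envBaseUrl || 'http://localhost:1234';
}

// With no settings and no env var, every request goes to port 1234,
// even when only Ollama (port 11434) is actually running.
console.log(resolveLMStudioBaseUrl());
// → http://localhost:1234
console.log(resolveLMStudioBaseUrl(undefined, 'http://localhost:11434'));
// → http://localhost:11434
```

So the request to port 1234 comes from the LM Studio provider's model probe, independent of whether Ollama is configured.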

Can we fix this?
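One possible direction (a sketch, not the project's actual fix): skip the model probe entirely when the provider is disabled, and inject the fetch function so an unreachable server degrades to an empty model list instead of surfacing an error. The `enabled` flag and `fetchImpl` parameter here are assumptions for illustration, not bolt.diy's real signature:

```typescript
interface ModelInfo { name: string; label: string; provider: string; }

// Sketch: only probe LM Studio when the provider is enabled, and
// tolerate fetch failures. `enabled` and `fetchImpl` are hypothetical
// parameters added for this illustration.
async function getLMStudioModelsSafe(
  baseUrl: string,
  enabled: boolean,
  fetchImpl: typeof fetch = fetch,
): Promise<ModelInfo[]> {
  if (!enabled) {
    return []; // no request to port 1234 at all when the provider is off
  }

  try {
    const response = await fetchImpl(`${baseUrl}/v1/models`);
    const data = (await response.json()) as { data: Array<{ id: string }> };

    return data.data.map((model) => ({
      name: model.id,
      label: model.id,
      provider: 'LMStudio',
    }));
  } catch {
    return []; // unreachable server → no models, not a chat-breaking error
  }
}
```

With the provider disabled in settings, nothing would ever contact port 1234, which matches what an Ollama-only setup expects.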

Platform

  • OS: Windows 11
  • Browser: Edge, Chrome
  • Version: Latest

Provider Used

No response

Model Used

No response

Additional context

No response
