Commit
Rename Chat Model Options table to Chat Model as short & readable (#1003)

- Previous name was incorrectly plural, though the table defines only a single model
- Rename chat model table field to name
- Update documentation
- Update referencing functions and variables to match the new name
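Renaming a Django model and one of its fields implies a schema migration. A sketch of what that migration could look like is below; this is not the actual migration file from this commit (it is not visible in the excerpted diff), and the dependency name is a placeholder:

```python
# Hypothetical sketch of the migration implied by the rename.
# The dependency entry is a placeholder, not a real migration name.
from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [
        ("database", "XXXX_previous_migration"),  # placeholder
    ]

    operations = [
        # Rename the model (and its table): ChatModelOptions -> ChatModel
        migrations.RenameModel(old_name="ChatModelOptions", new_name="ChatModel"),
        # Rename the field holding the model name: chat_model -> name
        migrations.RenameField(
            model_name="chatmodel", old_name="chat_model", new_name="name"
        ),
    ]
```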
debanjum authored Dec 12, 2024
1 parent 9be26e1 commit 01bc6d3
Showing 26 changed files with 369 additions and 308 deletions.
2 changes: 1 addition & 1 deletion documentation/docs/advanced/litellm.md
@@ -25,7 +25,7 @@ Using LiteLLM with Khoj makes it possible to turn any LLM behind an API into you
- Name: `proxy-name`
- Api Key: `any string`
- Api Base Url: **URL of your Openai Proxy API**
4. Create a new [Chat Model Option](http://localhost:42110/server/admin/database/chatmodeloptions/add) on your Khoj admin panel.
4. Create a new [Chat Model](http://localhost:42110/server/admin/database/chatmodel/add) on your Khoj admin panel.
- Name: `llama3.1` (replace with the name of your local model)
- Model Type: `Openai`
- Openai Config: `<the proxy config you created in step 3>`
2 changes: 1 addition & 1 deletion documentation/docs/advanced/lmstudio.md
@@ -18,7 +18,7 @@ LM Studio can expose an [OpenAI API compatible server](https://lmstudio.ai/docs/
- Name: `proxy-name`
- Api Key: `any string`
- Api Base Url: `http://localhost:1234/v1/` (default for LMStudio)
4. Create a new [Chat Model Option](http://localhost:42110/server/admin/database/chatmodeloptions/add) on your Khoj admin panel.
4. Create a new [Chat Model](http://localhost:42110/server/admin/database/chatmodel/add) on your Khoj admin panel.
- Name: `llama3.1` (replace with the name of your local model)
- Model Type: `Openai`
- Openai Config: `<the proxy config you created in step 3>`
2 changes: 1 addition & 1 deletion documentation/docs/advanced/ollama.mdx
@@ -64,7 +64,7 @@ Restart your Khoj server after first run or update to the settings below to ensu
- Name: `ollama`
- Api Key: `any string`
- Api Base Url: `http://localhost:11434/v1/` (default for Ollama)
4. Create a new [Chat Model Option](http://localhost:42110/server/admin/database/chatmodeloptions/add) on your Khoj admin panel.
4. Create a new [Chat Model](http://localhost:42110/server/admin/database/chatmodel/add) on your Khoj admin panel.
- Name: `llama3.1` (replace with the name of your local model)
- Model Type: `Openai`
- Openai Config: `<the ollama config you created in step 3>`
2 changes: 1 addition & 1 deletion documentation/docs/advanced/use-openai-proxy.md
@@ -25,7 +25,7 @@ For specific integrations, see our [Ollama](/advanced/ollama), [LMStudio](/advan
- Name: `any name`
- Api Key: `any string`
- Api Base Url: **URL of your Openai Proxy API**
3. Create a new [Chat Model Option](http://localhost:42110/server/admin/database/chatmodeloptions/add) on your Khoj admin panel.
3. Create a new [Chat Model](http://localhost:42110/server/admin/database/chatmodel/add) on your Khoj admin panel.
- Name: `llama3` (replace with the name of your local model)
- Model Type: `Openai`
- Openai Config: `<the proxy config you created in step 2>`
8 changes: 4 additions & 4 deletions documentation/docs/get-started/setup.mdx
@@ -307,7 +307,7 @@ Using Ollama? See the [Ollama Integration](/advanced/ollama) section for more cu
- Give the configuration a friendly name like `OpenAI`
- (Optional) Set the API base URL. It is only relevant if you're using another OpenAI-compatible proxy server like [Ollama](/advanced/ollama) or [LMStudio](/advanced/lmstudio).<br />
![example configuration for ai model api](/img/example_openai_processor_config.png)
2. Create a new [chat model options](http://localhost:42110/server/admin/database/chatmodeloptions/add)
2. Create a new [chat model](http://localhost:42110/server/admin/database/chatmodel/add)
- Set the `chat-model` field to an [OpenAI chat model](https://platform.openai.com/docs/models). Example: `gpt-4o`.
- Make sure to set the `model-type` field to `OpenAI`.
- If your model supports vision, set the `vision enabled` field to `true`. This is currently only supported for OpenAI models with vision capabilities.
@@ -318,7 +318,7 @@ Using Ollama? See the [Ollama Integration](/advanced/ollama) section for more cu
1. Create a new [AI Model API](http://localhost:42110/server/admin/database/aimodelapi/add) in the server admin settings.
- Add your [Anthropic API key](https://console.anthropic.com/account/keys)
- Give the configuration a friendly name like `Anthropic`. Do not configure the API base url.
2. Create a new [chat model options](http://localhost:42110/server/admin/database/chatmodeloptions/add)
2. Create a new [chat model](http://localhost:42110/server/admin/database/chatmodel/add)
- Set the `chat-model` field to an [Anthropic chat model](https://docs.anthropic.com/en/docs/about-claude/models#model-names). Example: `claude-3-5-sonnet-20240620`.
- Set the `model-type` field to `Anthropic`.
- Set the `ai model api` field to the Anthropic AI Model API you created in step 1.
@@ -327,7 +327,7 @@ Using Ollama? See the [Ollama Integration](/advanced/ollama) section for more cu
1. Create a new [AI Model API](http://localhost:42110/server/admin/database/aimodelapi/add) in the server admin settings.
- Add your [Gemini API key](https://aistudio.google.com/app/apikey)
- Give the configuration a friendly name like `Gemini`. Do not configure the API base url.
2. Create a new [chat model options](http://localhost:42110/server/admin/database/chatmodeloptions/add)
2. Create a new [chat model](http://localhost:42110/server/admin/database/chatmodel/add)
- Set the `chat-model` field to a [Google Gemini chat model](https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models#gemini-models). Example: `gemini-1.5-flash`.
- Set the `model-type` field to `Gemini`.
- Set the `ai model api` field to the Gemini AI Model API you created in step 1.
@@ -343,7 +343,7 @@ Offline chat stays completely private and can work without internet using any op
:::

1. Get the name of your preferred chat model from [HuggingFace](https://huggingface.co/models?pipeline_tag=text-generation&library=gguf). *Most GGUF format chat models are supported*.
2. Open the [create chat model page](http://localhost:42110/server/admin/database/chatmodeloptions/add/) on the admin panel
2. Open the [create chat model page](http://localhost:42110/server/admin/database/chatmodel/add/) on the admin panel
3. Set the `chat-model` field to the name of your preferred chat model
- Make sure the `model-type` is set to `Offline`
4. Set the newly added chat model as your preferred model in your [User chat settings](http://localhost:42110/settings) and [Server chat settings](http://localhost:42110/server/admin/database/serverchatsettings/).
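The setup steps above end by pointing at both the User chat settings and Server chat settings pages. Per the docstring on `get_default_chat_model` elsewhere in this commit, the selection preference is: chat model set by server admin > user > first created chat model. A minimal illustrative sketch (plain strings stand in for `ChatModel` rows; not the Khoj implementation):

```python
from typing import Optional


def pick_default_chat_model(
    server_default: Optional[str],
    user_default: Optional[str],
    first_created: Optional[str],
) -> Optional[str]:
    """Illustrative only: mirrors the documented preference order
    (server admin setting > user setting > first created chat model)."""
    for candidate in (server_default, user_default, first_created):
        if candidate is not None:
            return candidate
    return None
```

For example, with no server-level setting, `pick_default_chat_model(None, "llama3.1", "gpt-4o")` falls through to the user's choice, `"llama3.1"`.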
102 changes: 51 additions & 51 deletions src/khoj/database/adapters/__init__.py
@@ -36,7 +36,7 @@
from khoj.database.models import (
Agent,
AiModelApi,
ChatModelOptions,
ChatModel,
ClientApplication,
Conversation,
Entry,
@@ -736,8 +736,8 @@ def get_default_agent():

@staticmethod
def create_default_agent(user: KhojUser):
default_conversation_config = ConversationAdapters.get_default_conversation_config(user)
if default_conversation_config is None:
default_chat_model = ConversationAdapters.get_default_chat_model(user)
if default_chat_model is None:
logger.info("No default conversation config found, skipping default agent creation")
return None
default_personality = prompts.personality.format(current_date="placeholder", day_of_week="placeholder")
@@ -746,7 +746,7 @@ def create_default_agent(user: KhojUser):

if agent:
agent.personality = default_personality
agent.chat_model = default_conversation_config
agent.chat_model = default_chat_model
agent.slug = AgentAdapters.DEFAULT_AGENT_SLUG
agent.name = AgentAdapters.DEFAULT_AGENT_NAME
agent.privacy_level = Agent.PrivacyLevel.PUBLIC
@@ -760,7 +760,7 @@ def create_default_agent(user: KhojUser):
name=AgentAdapters.DEFAULT_AGENT_NAME,
privacy_level=Agent.PrivacyLevel.PUBLIC,
managed_by_admin=True,
chat_model=default_conversation_config,
chat_model=default_chat_model,
personality=default_personality,
slug=AgentAdapters.DEFAULT_AGENT_SLUG,
)
@@ -787,7 +787,7 @@ async def aupdate_agent(
output_modes: List[str],
slug: Optional[str] = None,
):
chat_model_option = await ChatModelOptions.objects.filter(chat_model=chat_model).afirst()
chat_model_option = await ChatModel.objects.filter(name=chat_model).afirst()

# Slug will be None for new agents, which will trigger a new agent creation with a generated, immutable slug
agent, created = await Agent.objects.filter(slug=slug, creator=user).aupdate_or_create(
@@ -972,29 +972,29 @@ async def adelete_conversation_by_user(

@staticmethod
@require_valid_user
def has_any_conversation_config(user: KhojUser):
return ChatModelOptions.objects.filter(user=user).exists()
def has_any_chat_model(user: KhojUser):
return ChatModel.objects.filter(user=user).exists()

@staticmethod
def get_all_conversation_configs():
return ChatModelOptions.objects.all()
def get_all_chat_models():
return ChatModel.objects.all()

@staticmethod
async def aget_all_conversation_configs():
return await sync_to_async(list)(ChatModelOptions.objects.prefetch_related("ai_model_api").all())
async def aget_all_chat_models():
return await sync_to_async(list)(ChatModel.objects.prefetch_related("ai_model_api").all())

@staticmethod
def get_vision_enabled_config():
conversation_configurations = ConversationAdapters.get_all_conversation_configs()
for config in conversation_configurations:
chat_models = ConversationAdapters.get_all_chat_models()
for config in chat_models:
if config.vision_enabled:
return config
return None

@staticmethod
async def aget_vision_enabled_config():
conversation_configurations = await ConversationAdapters.aget_all_conversation_configs()
for config in conversation_configurations:
chat_models = await ConversationAdapters.aget_all_chat_models()
for config in chat_models:
if config.vision_enabled:
return config
return None
@@ -1010,7 +1010,7 @@ def has_valid_ai_model_api():
@staticmethod
@arequire_valid_user
async def aset_user_conversation_processor(user: KhojUser, conversation_processor_config_id: int):
config = await ChatModelOptions.objects.filter(id=conversation_processor_config_id).afirst()
config = await ChatModel.objects.filter(id=conversation_processor_config_id).afirst()
if not config:
return None
new_config = await UserConversationConfig.objects.aupdate_or_create(user=user, defaults={"setting": config})
@@ -1026,24 +1026,24 @@ async def aset_user_voice_model(user: KhojUser, model_id: str):
return new_config

@staticmethod
def get_conversation_config(user: KhojUser):
def get_chat_model(user: KhojUser):
subscribed = is_user_subscribed(user)
if not subscribed:
return ConversationAdapters.get_default_conversation_config(user)
return ConversationAdapters.get_default_chat_model(user)
config = UserConversationConfig.objects.filter(user=user).first()
if config:
return config.setting
return ConversationAdapters.get_advanced_conversation_config(user)
return ConversationAdapters.get_advanced_chat_model(user)

@staticmethod
async def aget_conversation_config(user: KhojUser):
async def aget_chat_model(user: KhojUser):
subscribed = await ais_user_subscribed(user)
if not subscribed:
return await ConversationAdapters.aget_default_conversation_config(user)
return await ConversationAdapters.aget_default_chat_model(user)
config = await UserConversationConfig.objects.filter(user=user).prefetch_related("setting").afirst()
if config:
return config.setting
return ConversationAdapters.aget_advanced_conversation_config(user)
return ConversationAdapters.aget_advanced_chat_model(user)

@staticmethod
async def aget_voice_model_config(user: KhojUser) -> Optional[VoiceModelOption]:
@@ -1064,7 +1064,7 @@ def get_voice_model_config(user: KhojUser) -> Optional[VoiceModelOption]:
return VoiceModelOption.objects.first()

@staticmethod
def get_default_conversation_config(user: KhojUser = None):
def get_default_chat_model(user: KhojUser = None):
"""Get default conversation config. Prefer chat model by server admin > user > first created chat model"""
# Get the server chat settings
server_chat_settings = ServerChatSettings.objects.first()
@@ -1084,10 +1084,10 @@ def get_default_conversation_config(user: KhojUser = None):
return user_chat_settings.setting

# Get the first chat model if even the user chat settings are not set
return ChatModelOptions.objects.filter().first()
return ChatModel.objects.filter().first()

@staticmethod
async def aget_default_conversation_config(user: KhojUser = None):
async def aget_default_chat_model(user: KhojUser = None):
"""Get default conversation config. Prefer chat model by server admin > user > first created chat model"""
# Get the server chat settings
server_chat_settings: ServerChatSettings = (
@@ -1117,25 +1117,25 @@ async def aget_default_conversation_config(user: KhojUser = None):
return user_chat_settings.setting

# Get the first chat model if even the user chat settings are not set
return await ChatModelOptions.objects.filter().prefetch_related("ai_model_api").afirst()
return await ChatModel.objects.filter().prefetch_related("ai_model_api").afirst()

@staticmethod
def get_advanced_conversation_config(user: KhojUser):
def get_advanced_chat_model(user: KhojUser):
server_chat_settings = ServerChatSettings.objects.first()
if server_chat_settings is not None and server_chat_settings.chat_advanced is not None:
return server_chat_settings.chat_advanced
return ConversationAdapters.get_default_conversation_config(user)
return ConversationAdapters.get_default_chat_model(user)

@staticmethod
async def aget_advanced_conversation_config(user: KhojUser = None):
async def aget_advanced_chat_model(user: KhojUser = None):
server_chat_settings: ServerChatSettings = (
await ServerChatSettings.objects.filter()
.prefetch_related("chat_advanced", "chat_advanced__ai_model_api")
.afirst()
)
if server_chat_settings is not None and server_chat_settings.chat_advanced is not None:
return server_chat_settings.chat_advanced
return await ConversationAdapters.aget_default_conversation_config(user)
return await ConversationAdapters.aget_default_chat_model(user)

@staticmethod
async def aget_server_webscraper():
@@ -1247,16 +1247,16 @@ def save_conversation(

@staticmethod
def get_conversation_processor_options():
return ChatModelOptions.objects.all()
return ChatModel.objects.all()

@staticmethod
def set_conversation_processor_config(user: KhojUser, new_config: ChatModelOptions):
def set_user_chat_model(user: KhojUser, chat_model: ChatModel):
user_conversation_config, _ = UserConversationConfig.objects.get_or_create(user=user)
user_conversation_config.setting = new_config
user_conversation_config.setting = chat_model
user_conversation_config.save()

@staticmethod
async def aget_user_conversation_config(user: KhojUser):
async def aget_user_chat_model(user: KhojUser):
config = (
await UserConversationConfig.objects.filter(user=user).prefetch_related("setting__ai_model_api").afirst()
)
@@ -1288,33 +1288,33 @@ async def aget_conversation_starters(user: KhojUser, max_results=3):
return random.sample(all_questions, max_results)

@staticmethod
def get_valid_conversation_config(user: KhojUser, conversation: Conversation):
def get_valid_chat_model(user: KhojUser, conversation: Conversation):
agent: Agent = conversation.agent if AgentAdapters.get_default_agent() != conversation.agent else None
if agent and agent.chat_model:
conversation_config = conversation.agent.chat_model
chat_model = conversation.agent.chat_model
else:
conversation_config = ConversationAdapters.get_conversation_config(user)
chat_model = ConversationAdapters.get_chat_model(user)

if conversation_config is None:
conversation_config = ConversationAdapters.get_default_conversation_config()
if chat_model is None:
chat_model = ConversationAdapters.get_default_chat_model()

if conversation_config.model_type == ChatModelOptions.ModelType.OFFLINE:
if chat_model.model_type == ChatModel.ModelType.OFFLINE:
if state.offline_chat_processor_config is None or state.offline_chat_processor_config.loaded_model is None:
chat_model = conversation_config.chat_model
max_tokens = conversation_config.max_prompt_size
state.offline_chat_processor_config = OfflineChatProcessorModel(chat_model, max_tokens)
chat_model_name = chat_model.name
max_tokens = chat_model.max_prompt_size
state.offline_chat_processor_config = OfflineChatProcessorModel(chat_model_name, max_tokens)

return conversation_config
return chat_model

if (
conversation_config.model_type
chat_model.model_type
in [
ChatModelOptions.ModelType.ANTHROPIC,
ChatModelOptions.ModelType.OPENAI,
ChatModelOptions.ModelType.GOOGLE,
ChatModel.ModelType.ANTHROPIC,
ChatModel.ModelType.OPENAI,
ChatModel.ModelType.GOOGLE,
]
) and conversation_config.ai_model_api:
return conversation_config
) and chat_model.ai_model_api:
return chat_model

else:
raise ValueError("Invalid conversation config - either configure offline chat or openai chat")
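The branching in `get_valid_chat_model` above — offline models load locally, while Openai/Anthropic/Google model types must have an `ai_model_api` configured — can be reduced to this stand-in sketch (illustrative enum and function names, not the Khoj classes):

```python
from enum import Enum


class ModelType(str, Enum):
    # Stand-in for ChatModel.ModelType; values here are illustrative.
    OFFLINE = "offline"
    OPENAI = "openai"
    ANTHROPIC = "anthropic"
    GOOGLE = "google"


def requires_ai_model_api(model_type: ModelType) -> bool:
    """Mirrors the check sketched from get_valid_chat_model: cloud model
    types need an ai_model_api, while OFFLINE loads a local model instead."""
    return model_type in {ModelType.OPENAI, ModelType.ANTHROPIC, ModelType.GOOGLE}
```

This is why the final branch raises a `ValueError` when neither condition holds: a chat model that is not offline and has no API config cannot be used.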
16 changes: 8 additions & 8 deletions src/khoj/database/admin.py
@@ -16,7 +16,7 @@
from khoj.database.models import (
Agent,
AiModelApi,
ChatModelOptions,
ChatModel,
ClientApplication,
Conversation,
Entry,
@@ -212,15 +212,15 @@ class KhojUserSubscription(unfold_admin.ModelAdmin):
list_filter = ("type",)


@admin.register(ChatModelOptions)
class ChatModelOptionsAdmin(unfold_admin.ModelAdmin):
@admin.register(ChatModel)
class ChatModelAdmin(unfold_admin.ModelAdmin):
list_display = (
"id",
"chat_model",
"name",
"ai_model_api",
"max_prompt_size",
)
search_fields = ("id", "chat_model", "ai_model_api__name")
search_fields = ("id", "name", "ai_model_api__name")


@admin.register(TextToImageModelConfig)
@@ -385,7 +385,7 @@ class UserConversationConfigAdmin(unfold_admin.ModelAdmin):
"get_chat_model",
"get_subscription_type",
)
search_fields = ("id", "user__email", "setting__chat_model", "user__subscription__type")
search_fields = ("id", "user__email", "setting__name", "user__subscription__type")
ordering = ("-updated_at",)

def get_user_email(self, obj):
Expand All @@ -395,10 +395,10 @@ def get_user_email(self, obj):
get_user_email.admin_order_field = "user__email" # type: ignore

def get_chat_model(self, obj):
return obj.setting.chat_model if obj.setting else None
return obj.setting.name if obj.setting else None

get_chat_model.short_description = "Chat Model" # type: ignore
get_chat_model.admin_order_field = "setting__chat_model" # type: ignore
get_chat_model.admin_order_field = "setting__name" # type: ignore

def get_subscription_type(self, obj):
if hasattr(obj.user, "subscription"):