
Improve chat input #472

Merged · 9 commits into main on Nov 10, 2024

Conversation

samlhuillier (Collaborator)

No description provided.

@greptile-apps (bot) left a comment

PR Summary

This PR significantly refactors the chat interface and LLM configuration management in Reor. Here's a focused review of the key changes:

  • Introduces new hooks useAgentConfig and useLLMConfigs to centralize LLM and agent configuration management, though useAgentConfig is missing error handling and cleanup
  • Potential race condition in use-llm-configs.ts when setting the default LLM during config fetch
  • Inconsistent toast notification settings between the global config and error handling in App.tsx
  • Fixed-width textarea in ChatInput.tsx (600px) could cause responsive layout issues
  • Potential uncaught error in the UI when no LLM is selected in ChatMessages.tsx

The changes improve code organization but introduce some reliability concerns that should be addressed before merging.

8 file(s) reviewed, 13 comment(s)

```tsx
  hideProgressBar={false}
  closeOnClick
  pauseOnHover
/>{' '}
```

syntax: unnecessary trailing whitespace after ToastContainer closing tag

Comment on lines 30 to 33
```tsx
if (!e.shiftKey && e.key === 'Enter') {
  e.preventDefault()
  handleSubmitNewMessage()
}
```

logic: Submit message is now triggered even when input is empty, unlike previous version which checked userTextFieldInput
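A minimal way to restore the previous guard is a small predicate (name hypothetical, not part of the PR) that the keydown handler calls before submitting:

```typescript
// Hypothetical predicate: only submit on plain Enter with non-empty input,
// mirroring the check on userTextFieldInput that the previous version had.
function shouldSubmitOnEnter(key: string, shiftKey: boolean, input: string): boolean {
  return key === 'Enter' && !shiftKey && input.trim().length > 0
}

// In the keydown handler:
// if (shouldSubmitOnEnter(e.key, e.shiftKey, userTextFieldInput)) {
//   e.preventDefault()
//   handleSubmitNewMessage()
// }
```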

```tsx
    className={userTextFieldInput ? 'cursor-pointer' : ''}
  />
</div>
<div className=" flex w-full ">
```

syntax: Extra space in className string (' flex' and trailing space)

```tsx
  autoFocus
/>
<div className="mx-auto h-px w-[96%] bg-background/20" />
<div className="flex h-10 flex-col items-center justify-between gap-2 py-2 md:flex-row md:gap-4">
```

syntax: Extra space in gap-2 py-2

Comment on lines 88 to 92
```tsx
const handleSubmitNewMessage = async () => {
  if (userTextFieldInput) {
    // this for v1 could just use the default agent config...
    const agentConfigs = await window.electronStore.getAgentConfigs()
    if (agentConfigs && agentConfigs.length > 0) {
      handleNewChatMessage(userTextFieldInput, agentConfigs[0])
    } else {
      handleNewChatMessage(userTextFieldInput)
      if (!selectedLLM) {
        throw new Error('Select an LLM.')
      }
```

logic: Throwing an error directly will crash the UI. Should handle this gracefully with toast/alert instead.
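One graceful alternative is to report the condition as a value rather than throwing, and let the caller surface it. This is a sketch: `llmSelectionError` is a hypothetical helper, and the usage comment assumes the react-toastify `toast` API that the App.tsx ToastContainer already configures.

```typescript
// Hypothetical helper: report the missing-LLM condition as a value
// instead of throwing inside the submit handler.
function llmSelectionError(selectedLLM: string | undefined): string | null {
  return selectedLLM ? null : 'Select an LLM.'
}

// In handleSubmitNewMessage, instead of `throw new Error('Select an LLM.')`:
// const error = llmSelectionError(selectedLLM)
// if (error) {
//   toast.error(error)
//   return
// }
```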

Comment on lines +9 to +10
```ts
selectedLLM: string | undefined
setSelectedLLM: (value: string | undefined) => void
```

logic: selectedLLM type allows undefined but handleLLMChange assumes string. Add type guard or handle undefined case.
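A small user-defined type guard (hypothetical name, not in the PR) would let handleLLMChange narrow the union before using the value:

```typescript
// Hypothetical type guard: narrows `string | undefined` to `string`
// so handleLLMChange can safely assume a concrete model name.
function isSelectedLLM(value: string | undefined): value is string {
  return typeof value === 'string' && value.length > 0
}

// const handleLLMChange = (value: string | undefined) => {
//   if (!isSelectedLLM(value)) return
//   setSelectedLLM(value) // `value` is narrowed to string here
// }
```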

Comment on lines +14 to +21
```ts
const fetchAgentConfigs = async () => {
  const agentConfigs = await window.electronStore.getAgentConfigs()
  if (agentConfigs && agentConfigs.length > 0) {
    setAgentConfig(agentConfigs[0])
  } else {
    setAgentConfig(exampleAgents[0])
  }
}
```

logic: Missing try/catch block around electronStore call. API calls should handle potential failures.
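One way to add that handling is a generic wrapper (a sketch; `fetchWithFallback` is a hypothetical helper, not part of the PR) that swallows a rejected store read and resolves to a fallback value:

```typescript
// Hypothetical helper: resolve to `fallback` when the fetcher rejects or
// returns nothing, so a failed electronStore read degrades gracefully
// instead of leaving state unset.
async function fetchWithFallback<T>(fetcher: () => Promise<T | undefined>, fallback: T): Promise<T> {
  try {
    const result = await fetcher()
    return result ?? fallback
  } catch {
    return fallback
  }
}

// const configs = await fetchWithFallback(() => window.electronStore.getAgentConfigs(), [])
// setAgentConfig(configs.length > 0 ? configs[0] : exampleAgents[0])
```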

Comment on lines +13 to +23
```ts
useEffect(() => {
  const fetchAgentConfigs = async () => {
    const agentConfigs = await window.electronStore.getAgentConfigs()
    if (agentConfigs && agentConfigs.length > 0) {
      setAgentConfig(agentConfigs[0])
    } else {
      setAgentConfig(exampleAgents[0])
    }
  }
  fetchAgentConfigs()
}, [])
```

style: Add cleanup function to cancel pending operations if component unmounts during fetch
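A common pattern is to return a cleanup that flips a flag, so a fetch resolving after unmount no longer touches state. This is a sketch; `startCancellableFetch` is a hypothetical helper, not part of the PR:

```typescript
// Hypothetical helper: run an async fetch and ignore its result once the
// returned cleanup has been called (e.g. on component unmount).
function startCancellableFetch<T>(fetcher: () => Promise<T>, onResult: (value: T) => void): () => void {
  let cancelled = false
  fetcher().then((value) => {
    if (!cancelled) onResult(value)
  })
  return () => {
    cancelled = true
  }
}

// useEffect(() => {
//   return startCancellableFetch(
//     () => window.electronStore.getAgentConfigs(),
//     (configs) => setAgentConfig(configs && configs.length > 0 ? configs[0] : exampleAgents[0]),
//   )
// }, [])
```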

Comment on lines 13 to 18
```diff
 if (!storedDefaultLLM && fetchedLLMConfigs.length > 0) {
   await window.llm.setDefaultLLM(fetchedLLMConfigs[0].modelName)
-  setDefaultLLM(fetchedLLMConfigs[0].modelName)
+  setDefaultLocalLLM(fetchedLLMConfigs[0].modelName)
 } else {
-  setDefaultLLM(storedDefaultLLM)
+  setDefaultLocalLLM(storedDefaultLLM)
 }
```

logic: Race condition possible between checking storedDefaultLLM and setting it. Should await the setDefaultLLM call before setting local state.

```diff
@@ -4,20 +4,25 @@ import { LLMConfig } from 'electron/main/electron-store/storeConfig'

 const useLLMConfigs = () => {
   const [llmConfigs, setLLMConfigs] = useState<LLMConfig[]>([])
-  const [defaultLLM, setDefaultLLM] = useState<string>('')
+  const [defaultLLM, setDefaultLocalLLM] = useState<string>('')
```

style: Consider initializing defaultLLM as undefined instead of empty string to better represent unset state

@samlhuillier merged commit 1500024 into main on Nov 10, 2024
6 of 10 checks passed