Remove LLM semantics from Chat models #14
Conversation
Codecov Report
Attention: Patch coverage is …
Additional details and impacted files

@@            Coverage Diff             @@
##             main      #14      +/-   ##
==========================================
+ Coverage   84.87%   85.44%   +0.58%
==========================================
  Files          13       14       +1
  Lines         687      714      +27
==========================================
+ Hits          583      610      +27
  Misses        104      104
Continue to review full report in Codecov by Sentry.
Thank you @philippzagar; looks great to me! 🚀
Remove LLM semantics from Chat models
♻️ Current situation & Problem
As of now, SpeziChat contains some LLM semantics in the `Chat` and `ChatEntity` models. This mixes two different components that should live independently of each other.
⚙️ Release Notes
📚 Documentation
Updates documentation
✅ Testing
--
📝 Code of Conduct & Contributing Guidelines
By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines: