diff --git a/.changes/unreleased/Fixed and Improvements-20241031-155759.yaml b/.changes/unreleased/Fixed and Improvements-20241031-155759.yaml
deleted file mode 100644
index da0f5a77d6a5..000000000000
--- a/.changes/unreleased/Fixed and Improvements-20241031-155759.yaml
+++ /dev/null
@@ -1,3 +0,0 @@
-kind: Fixed and Improvements
-body: bump llama.cpp version to b3995 to support minicpm3 model arch
-time: 2024-10-31T15:57:59.392614+08:00
diff --git a/.changes/v0.20.0.md b/.changes/v0.20.0.md
new file mode 100644
index 000000000000..6e5655817043
--- /dev/null
+++ b/.changes/v0.20.0.md
@@ -0,0 +1,12 @@
+## v0.20.0 (2024-11-08)
+
+### Features
+
+* Search results can now be edited directly.
+* Allow switching backend chat models in Answer Engine.
+* Added a connection test button in the `System` tab to test the connection to the backend LLM server.
+
+### Fixes and Improvements
+
+* Optimized CR-LF inference in code completion. ([#3279](https://github.com/TabbyML/tabby/issues/3279))
+* Bumped `llama.cpp` version to `b3995`.
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 6d41aa8166ba..19e61b78463f 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -5,6 +5,19 @@
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html),
 and is generated by [Changie](https://github.com/miniscruff/changie).
+## v0.20.0 (2024-11-08)
+
+### Features
+
+* Search results can now be edited directly.
+* Allow switching backend chat models in Answer Engine.
+* Added a connection test button in the `System` tab to test the connection to the backend LLM server.
+
+### Fixes and Improvements
+
+* Optimized CR-LF inference in code completion. ([#3279](https://github.com/TabbyML/tabby/issues/3279))
+* Bumped `llama.cpp` version to `b3995`.
+
 ## v0.19.0 (2024-10-30)
 
 ### Features