From e8c90231469eb87c4b7b42e26c4ed95473aed1a9 Mon Sep 17 00:00:00 2001
From: Meng Zhang
Date: Sun, 10 Nov 2024 07:37:44 -0800
Subject: [PATCH 1/2] docs(changelog): add 0.20.0 changelog

---
 .../Fixed and Improvements-20241031-155759.yaml |  3 ---
 .changes/v0.20.0.md                             | 12 ++++++++++++
 CHANGELOG.md                                    | 13 +++++++++++++
 3 files changed, 25 insertions(+), 3 deletions(-)
 delete mode 100644 .changes/unreleased/Fixed and Improvements-20241031-155759.yaml
 create mode 100644 .changes/v0.20.0.md

diff --git a/.changes/unreleased/Fixed and Improvements-20241031-155759.yaml b/.changes/unreleased/Fixed and Improvements-20241031-155759.yaml
deleted file mode 100644
index da0f5a77d6a5..000000000000
--- a/.changes/unreleased/Fixed and Improvements-20241031-155759.yaml
+++ /dev/null
@@ -1,3 +0,0 @@
-kind: Fixed and Improvements
-body: bump llama.cpp version to b3995 to support minicpm3 model arch
-time: 2024-10-31T15:57:59.392614+08:00
diff --git a/.changes/v0.20.0.md b/.changes/v0.20.0.md
new file mode 100644
index 000000000000..e1c7b6fec2fb
--- /dev/null
+++ b/.changes/v0.20.0.md
@@ -0,0 +1,12 @@
+## v0.20.0 (2024-11-10)
+
+### Features
+
+* Search results can now be edited directly.
+* Allow switching backend chat models in Answer Engine.
+* Added a connection test button in the `System` tab to test the connection to the backend LLM server.
+
+### Fixes and Improvements
+
+* Optimized CR-LF inference in code completion. ([#3279](https://github.com/TabbyML/tabby/issues/3279))
+* Bumped `llama.cpp` version to `b3995`.
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 6d41aa8166ba..d02f39cceab0 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -5,6 +5,19 @@
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html),
 and is generated by [Changie](https://github.com/miniscruff/changie).
+## v0.20.0 (2024-11-10)
+
+### Features
+
+* Search results can now be edited directly.
+* Allow switching backend chat models in Answer Engine.
+* Added a connection test button in the `System` tab to test the connection to the backend LLM server.
+
+### Fixes and Improvements
+
+* Optimized CR-LF inference in code completion. ([#3279](https://github.com/TabbyML/tabby/issues/3279))
+* Bumped `llama.cpp` version to `b3995`.
+
 ## v0.19.0 (2024-10-30)
 
 ### Features

From 1f1b12510848311ea4b8f4024b8e24d241af4809 Mon Sep 17 00:00:00 2001
From: Meng Zhang
Date: Sun, 10 Nov 2024 07:39:47 -0800
Subject: [PATCH 2/2] update

---
 .changes/v0.20.0.md | 2 +-
 CHANGELOG.md        | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/.changes/v0.20.0.md b/.changes/v0.20.0.md
index e1c7b6fec2fb..6e5655817043 100644
--- a/.changes/v0.20.0.md
+++ b/.changes/v0.20.0.md
@@ -1,4 +1,4 @@
-## v0.20.0 (2024-11-10)
+## v0.20.0 (2024-11-08)
 
 ### Features
 
diff --git a/CHANGELOG.md b/CHANGELOG.md
index d02f39cceab0..19e61b78463f 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -5,7 +5,7 @@
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html),
 and is generated by [Changie](https://github.com/miniscruff/changie).
-## v0.20.0 (2024-11-10)
+## v0.20.0 (2024-11-08)
 
 ### Features
 