Commit

v0.11.22 (#16839)

logan-markewich authored Nov 5, 2024
1 parent 0eb8fd0 commit 56358e5
Showing 13 changed files with 181 additions and 23 deletions.
70 changes: 70 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,75 @@
# ChangeLog

## [2024-11-05]

### `llama-index-core` [0.11.22]

- bring back support for prompt templates in context chat engines (#16821)
- Fixed the JSON Format of Generated Sub-Question (double curly brackets) (#16820)
- markdown splitter improve metadata (#16789)
- fix empty index + generation synthesizer (#16785)

### `llama-index-embeddings-azure-inference` [0.2.4]

- Support for api_version and Azure AI model inference service (#16802)

### `llama-index-embeddings-gemini` [0.2.2]

- fix await-async-embeddings (#16790)

### `llama-index-embeddings-siliconflow` [0.1.0]

- add siliconflow embedding class (#16753)

### `llama-index-indices-managed-vectara` [0.2.4]

- Hotfix: Chain Query Configuration (#16818)

### `llama-index-llms-anthropic` [0.3.9]

- Add Anthropic Claude Haiku 3.5 to the list of supported Claude models (#16823)

### `llama-index-llms-azure-inference` [0.2.4]

- Support for api_version and Azure AI model inference service (#16802)

### `llama-index-llms-bedrock` [0.2.6]

- Add Anthropic Claude Haiku 3.5 to the list of supported Claude models for bedrock and bedrock-converse integrations (#16825)

### `llama-index-llms-bedrock-converse` [0.3.7]

- Add Anthropic Claude Haiku 3.5 to the list of supported Claude models for bedrock and bedrock-converse integrations (#16825)

### `llama-index-llms-dashscope` [0.2.5]

- More tolerant definition of LLMMetadata information (#16830)
- Fix abstract method signature error (#16809)

### `llama-index-llms-vllm` [0.3.0]

- remove beam search param for latest vllm (#16817)

### `llama-index-postprocessor-colpali-rerank` [0.1.0]

- Add ColPali as reranker (#16829)

### `llama-index-postprocessor-siliconflow-rerank` [0.1.0]

- add siliconflow rerank class (#16737)

### `llama-index-readers-microsoft-onedrive` [0.2.2]

- fix: add required_exts for one drive reader (#16822)

### `llama-index-vector-stores-chroma` [0.3.0]

- Support breaking changes to filter syntax in latest chroma (#16806)
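The breaking filter-syntax change above can be illustrated; the following is a hedged sketch (not the integration's actual code, names are illustrative) of how a flat metadata-filter dict maps onto the stricter syntax newer Chroma releases expect, where multiple conditions must be wrapped in an explicit `$and`:

```python
def to_chroma_where(filters: dict) -> dict:
    """Translate a flat dict like {"author": "alice", "year": 2024}
    into Chroma's newer `where` syntax."""
    clauses = [{key: {"$eq": value}} for key, value in filters.items()]
    if len(clauses) == 1:
        # A single condition may still be passed directly.
        return clauses[0]
    # Multiple conditions must be joined with an explicit operator.
    return {"$and": clauses}

print(to_chroma_where({"author": "alice", "year": 2024}))
```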

### `llama-index-vector-stores-pinecone` [0.3.0]

- support sparse embedding models, fix delete for serverless for pinecone (#16819)

## [2024-10-31]

### `llama-index-core` [0.11.21]
72 changes: 71 additions & 1 deletion docs/docs/CHANGELOG.md
@@ -1,13 +1,83 @@
# ChangeLog

## [2024-11-05]

### `llama-index-core` [0.11.22]

- bring back support for prompt templates in context chat engines (#16821)
- Fixed the JSON Format of Generated Sub-Question (double curly brackets) (#16820)
- markdown splitter improve metadata (#16789)
- fix empty index + generation synthesizer (#16785)

### `llama-index-embeddings-azure-inference` [0.2.4]

- Support for api_version and Azure AI model inference service (#16802)

### `llama-index-embeddings-gemini` [0.2.2]

- fix await-async-embeddings (#16790)

### `llama-index-embeddings-siliconflow` [0.1.0]

- add siliconflow embedding class (#16753)

### `llama-index-indices-managed-vectara` [0.2.4]

- Hotfix: Chain Query Configuration (#16818)

### `llama-index-llms-anthropic` [0.3.9]

- Add Anthropic Claude Haiku 3.5 to the list of supported Claude models (#16823)

### `llama-index-llms-azure-inference` [0.2.4]

- Support for api_version and Azure AI model inference service (#16802)

### `llama-index-llms-bedrock` [0.2.6]

- Add Anthropic Claude Haiku 3.5 to the list of supported Claude models for bedrock and bedrock-converse integrations (#16825)

### `llama-index-llms-bedrock-converse` [0.3.7]

- Add Anthropic Claude Haiku 3.5 to the list of supported Claude models for bedrock and bedrock-converse integrations (#16825)

### `llama-index-llms-dashscope` [0.2.5]

- More tolerant definition of LLMMetadata information (#16830)
- Fix abstract method signature error (#16809)

### `llama-index-llms-vllm` [0.3.0]

- remove beam search param for latest vllm (#16817)

### `llama-index-postprocessor-colpali-rerank` [0.1.0]

- Add ColPali as reranker (#16829)

### `llama-index-postprocessor-siliconflow-rerank` [0.1.0]

- add siliconflow rerank class (#16737)

### `llama-index-readers-microsoft-onedrive` [0.2.2]

- fix: add required_exts for one drive reader (#16822)

### `llama-index-vector-stores-chroma` [0.3.0]

- Support breaking changes to filter syntax in latest chroma (#16806)

### `llama-index-vector-stores-pinecone` [0.3.0]

- support sparse embedding models, fix delete for serverless for pinecone (#16819)

## [2024-10-31]

### `llama-index-core` [0.11.21]

- Fixed issue with default value set as None for workflow `ctx.get()` (#16756)
- fix various issues with react agent streaming (#16755)
- add unit test for query pipeline (#16749)
- Fix \_merge_ref_doc_kv_pairs duped for-loop (#16739)
- bugfix: determine if nodes is none when creating index (#16703)
- fixes LLMRerank default_parse_choice_select_answer_fn parsing issue (#16736)
- fix return type check on workflows (#16724)
6 changes: 6 additions & 0 deletions docs/docs/api_reference/callbacks/opentelemetry.md
@@ -0,0 +1,6 @@
::: llama_index.callbacks.opentelemetry
options:
members:
- OpenTelemetryEventHandler
- OpenTelemetrySpanHandler
- instrument_opentelemetry
4 changes: 4 additions & 0 deletions docs/docs/api_reference/llms/sambanovacloud.md
@@ -0,0 +1,4 @@
::: llama_index.llms.sambanovacloud
options:
members:
- SambaNovaCloud
4 changes: 4 additions & 0 deletions docs/mkdocs.yml
@@ -852,6 +852,7 @@ nav:
- ./api_reference/callbacks/literalai.md
- ./api_reference/callbacks/llama_debug.md
- ./api_reference/callbacks/openinference.md
- ./api_reference/callbacks/opentelemetry.md
- ./api_reference/callbacks/opik.md
- ./api_reference/callbacks/promptlayer.md
- ./api_reference/callbacks/token_counter.md
@@ -1038,6 +1039,7 @@ nav:
- ./api_reference/llms/rungpt.md
- ./api_reference/llms/sagemaker_endpoint.md
- ./api_reference/llms/sambanova.md
- ./api_reference/llms/sambanovacloud.md
- ./api_reference/llms/solar.md
- ./api_reference/llms/text_generation_inference.md
- ./api_reference/llms/together.md
@@ -2302,6 +2304,8 @@ plugins:
- ../llama-index-integrations/embeddings/llama-index-embeddings-siliconflow
- ../llama-index-integrations/memory/llama-index-memory-mem0
- ../llama-index-integrations/postprocessor/llama-index-postprocessor-siliconflow-rerank
- ../llama-index-integrations/callbacks/llama-index-callbacks-opentelemetry
- ../llama-index-integrations/llms/llama-index-llms-sambanovacloud
- redirects:
redirect_maps:
./api/llama_index.vector_stores.MongoDBAtlasVectorSearch.html: api_reference/storage/vector_store/mongodb.md
2 changes: 1 addition & 1 deletion llama-index-core/llama_index/core/__init__.py
@@ -1,6 +1,6 @@
"""Init file of LlamaIndex."""

__version__ = "0.11.21"
__version__ = "0.11.22"

import logging
from logging import NullHandler
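The version bump above can be checked at runtime; a minimal sketch (not part of this commit) that reads the installed distribution's version via the standard library:

```python
from importlib.metadata import PackageNotFoundError, version

def core_version() -> str:
    """Return the installed llama-index-core version, or a placeholder."""
    try:
        return version("llama-index-core")
    except PackageNotFoundError:
        return "not installed"

print(core_version())
```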
2 changes: 1 addition & 1 deletion llama-index-core/pyproject.toml
@@ -46,7 +46,7 @@ name = "llama-index-core"
packages = [{include = "llama_index"}]
readme = "README.md"
repository = "https://github.com/run-llama/llama_index"
version = "0.11.21"
version = "0.11.22"

[tool.poetry.dependencies]
SQLAlchemy = {extras = ["asyncio"], version = ">=1.4.49"}
@@ -27,11 +27,12 @@ exclude = ["**/BUILD"]
license = "MIT"
name = "llama-index-embeddings-siliconflow"
readme = "README.md"
version = "0.1.0"
version = "0.1.1"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.11.0"
aiohttp = "*"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
@@ -27,11 +27,12 @@ exclude = ["**/BUILD"]
license = "MIT"
name = "llama-index-embeddings-textembed"
readme = "README.md"
version = "0.1.0"
version = "0.1.1"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.11.0"
aiohttp = "*"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
@@ -27,11 +27,12 @@ exclude = ["**/BUILD"]
license = "MIT"
name = "llama-index-embeddings-xinference"
readme = "README.md"
version = "0.1.0"
version = "0.1.1"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.11.0"
aiohttp = "*"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
@@ -25,12 +25,13 @@ authors = ["Your Name <[email protected]>"]
description = "llama-index llms sambanova cloud integration"
name = "llama-index-llms-sambanovacloud"
readme = "README.md"
version = "0.3.1"
version = "0.3.2"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.11.0"
python-dotenv = "^1.0.1"
aiohttp = "*"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
28 changes: 14 additions & 14 deletions poetry.lock


4 changes: 2 additions & 2 deletions pyproject.toml
@@ -45,7 +45,7 @@ name = "llama-index"
packages = [{from = "_llama-index", include = "llama_index"}]
readme = "README.md"
repository = "https://github.com/run-llama/llama_index"
version = "0.11.21"
version = "0.11.22"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
@@ -58,7 +58,7 @@ llama-index-agent-openai = "^0.3.4"
llama-index-readers-file = "^0.2.0"
llama-index-readers-llama-parse = ">=0.3.0"
llama-index-indices-managed-llama-cloud = ">=0.3.0"
llama-index-core = "^0.11.20"
llama-index-core = "^0.11.22"
llama-index-multi-modal-llms-openai = "^0.2.0"
llama-index-cli = "^0.3.1"
nltk = ">3.8.1" # avoids a CVE, temp until next release, should be in llama-index-core
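The `llama-index-core = "^0.11.22"` constraint above uses Poetry's caret semantics: for a 0.x version, the caret pins the minor component, allowing `>=0.11.22,<0.12.0`. A pure-stdlib sketch of that rule (not Poetry's implementation — a simplified illustration for three-part versions):

```python
def caret_allows(constraint: str, candidate: str) -> bool:
    """Check whether a candidate version satisfies a caret constraint
    like "^0.11.22" (simplified: assumes "major.minor.patch" strings)."""
    base = tuple(int(p) for p in constraint.lstrip("^").split("."))
    cand = tuple(int(p) for p in candidate.split("."))
    # Caret pins the leftmost non-zero component: 0.y.z -> <0.(y+1).0,
    # x.y.z (x > 0) -> <(x+1).0.0.
    if base[0] == 0:
        upper = (0, base[1] + 1, 0)
    else:
        upper = (base[0] + 1, 0, 0)
    return base <= cand < upper

print(caret_allows("^0.11.22", "0.11.25"))  # True
print(caret_allows("^0.11.22", "0.12.0"))   # False
```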
