[Bug]: BedrockConverse updating the system_prompt affects different chat_engines #17265

Open
rebenely opened this issue Dec 13, 2024 · 1 comment
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@rebenely

Bug Description

https://github.com/run-llama/llama_index/blame/095d410249f6bd8e571275993b418af688ca2daf/llama-index-integrations/llms/llama-index-llms-bedrock-converse/llama_index/llms/bedrock_converse/base.py#L285

The system_prompt includes previous messages and responses. As a result, separate chat_engines "know" data from outside their own indexes.

Version

0.12.2

Steps to Reproduce

from llama_index.core import VectorStoreIndex
from llama_index.core import Document
from llama_index.llms.bedrock_converse import BedrockConverse
from llama_index.core import Settings
from botocore.session import Session
from llama_index.embeddings.bedrock import BedrockEmbedding


# setup session for aws
session = Session()
Settings.llm = BedrockConverse(model="meta.llama3-1-8b-instruct-v1:0", botocore_session=session)
Settings.embed_model = BedrockEmbedding(model="amazon.titan-embed-text-v1", botocore_session=session)

# build index
flamingo_index = VectorStoreIndex.from_documents([
    Document(text="The secret word is 'flamingo'.")
])

# configure retriever
chat_engine_flamingo = flamingo_index.as_chat_engine(
    chat_mode="context"
)


print(chat_engine_flamingo.chat("What are the secret words?"))


# build another index
lumen_index = VectorStoreIndex.from_documents([
    Document(text="The secret word is 'lumen'.")
])

# configure retriever
chat_engine_lumen = lumen_index.as_chat_engine(
    chat_mode="context"
)

print(chat_engine_lumen.chat("What are the secret words?"))

In this example, chat_engine_lumen should only know 'lumen' as the secret word. But due to the issue, it knows from the system prompt that 'flamingo' is also a secret word.
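The leakage pattern above can be sketched without AWS access. The following is an illustrative stand-in (the class and method names are not LlamaIndex APIs): two engines reuse one LLM object whose system_prompt is appended to in place, so the first engine's context ends up in the second engine's prompt.

```python
# Minimal sketch of the suspected leakage pattern: two "engines" share
# one LLM object whose system_prompt is mutated in place on each chat.
# Classes here are illustrative stand-ins, not LlamaIndex APIs.

class SharedLLM:
    def __init__(self):
        self.system_prompt = ""


class Engine:
    def __init__(self, llm, context):
        self.llm = llm
        self.context = context

    def chat(self, message):
        # Buggy pattern: appending to the shared llm's system_prompt
        # persists across every engine that reuses the same llm.
        self.llm.system_prompt += f"\n{self.context}\n{message}"
        return self.llm.system_prompt


llm = SharedLLM()
flamingo = Engine(llm, "The secret word is 'flamingo'.")
lumen = Engine(llm, "The secret word is 'lumen'.")

flamingo.chat("What are the secret words?")
leaked = lumen.chat("What are the secret words?")
print("'flamingo'" in leaked)  # True: flamingo's context leaked into lumen
```

This mirrors the reproduction above, where the shared Settings.llm instance is the single BedrockConverse object used by both chat engines.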

Relevant Logs/Tracebacks

The secret word is 'flamingo'.


The secret words are 'flamingo' and 'lumen'.
rebenely added the bug and triage labels on Dec 13, 2024

dosubot bot commented Dec 13, 2024

Hey there, @rebenely! I'm here to help you with any coding bugs, questions, or to guide you on becoming a contributor. Let's tackle this bug together!

The issue you're encountering with the BedrockConverse class in the Llama Index integration is due to the way the system_prompt is constructed. The system_prompt is built using the messages_to_prompt function, which includes previous messages in the prompt. This can lead to data leakage between different chat engines, as they inadvertently share context and knowledge that should be isolated [1].

To resolve this issue, you need to ensure that each chat engine maintains its own isolated context. This can be achieved by modifying the implementation to prevent the sharing of system_prompt data between different chat engines. Each chat engine should have its own instance of the system_prompt that is constructed solely from its own messages and context, without including messages from other engines.

This might involve changes to how the system_prompt is initialized and managed within the BedrockConverse class, ensuring that it is specific to each chat engine instance and does not carry over any information from other instances.
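The suggested isolation can be sketched as follows. This is an illustrative model (not LlamaIndex code, and the names are hypothetical): the LLM holds no prompt state, each engine builds its system_prompt only from its own context, and the prompt is passed per call, so a shared LLM instance cannot accumulate context across engines.

```python
# Illustrative sketch of the suggested fix: keep system_prompt state on
# the engine, not on the shared LLM, and pass it per call so nothing
# accumulates on the shared object. Not LlamaIndex APIs.

class StatelessLLM:
    def complete(self, system_prompt: str, message: str) -> str:
        # Echoes its inputs; stands in for a real model call.
        return f"{system_prompt}\n{message}"


class Engine:
    def __init__(self, llm: StatelessLLM, context: str):
        self.llm = llm
        # Per-engine system prompt built only from this engine's context.
        self.system_prompt = f"Context: {context}"

    def chat(self, message: str) -> str:
        return self.llm.complete(self.system_prompt, message)


shared_llm = StatelessLLM()  # safe to share: it holds no prompt state
flamingo = Engine(shared_llm, "The secret word is 'flamingo'.")
lumen = Engine(shared_llm, "The secret word is 'lumen'.")

flamingo.chat("What are the secret words?")
answer = lumen.chat("What are the secret words?")
print("'flamingo'" in answer)  # False: the prompts stay isolated
```

As a user-side workaround under the same assumption, constructing a separate BedrockConverse instance for each chat engine (rather than sharing one via Settings.llm) should likewise keep the prompts isolated.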

To continue talking to Dosu, mention @dosu.


