create new notebook example #74

Open · wants to merge 1 commit into main

Conversation

@samgdf commented Nov 8, 2024

Uses a custom chain. Per feedback, this is added as a standalone example to link out to, instead of keeping it in the docs: https://github.com/rungalileo/docs-mintlify/pull/86

Commit: uses custom chain

ReviewNB: Check out this pull request on ReviewNB to see visual diffs and provide feedback on Jupyter Notebooks.

@setu4993 (Member) left a comment:
Can we set env vars here the way we do in other examples, please? That way, we can easily build tests from this.
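For example, something along these lines (a sketch only; the exact variable names the other examples use, such as GALILEO_CONSOLE_URL and GALILEO_API_KEY, are assumptions here):

import os
from getpass import getpass

# Assumed variable names; mirror whatever the other examples in this repo use.
if not os.environ.get("GALILEO_CONSOLE_URL"):
    os.environ["GALILEO_CONSOLE_URL"] = input("Galileo console URL: ")
if not os.environ.get("GALILEO_API_KEY"):
    os.environ["GALILEO_API_KEY"] = getpass("Galileo API key: ")
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")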

for i, prompt in enumerate(inputs):
    if prompt["role"] == "system":
Contributor:
Can you fix this? And include the entire history below?
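i.e. keep a running messages list, append every turn to it, and pass the whole list to the model rather than only the current prompt. A rough sketch under that reading (client, tools and the model name come from the rest of the notebook and are assumed here):

messages = []  # running conversation history

for prompt in inputs:
    messages.append(prompt)  # keep every user turn
    response = client.chat.completions.create(
        model="gpt-4o",        # assumed model
        messages=messages,     # send the entire history, not just the latest prompt
        tools=tools,
    )
    output_message = response.choices[0].message
    messages.append(output_message)  # keep the assistant reply too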

    tool_output = get_delivery_date(order_id)
    wf.add_tool(
        input=response.choices[0].message.tool_calls[0].function.arguments,
        output=f"Your delivery date is {tool_output}.",
Contributor:
I would log the output of the tool as-is (without putting it into a "Your delivery date is..." string).
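i.e. roughly (a sketch, using only the wf.add_tool call already in this notebook):

tool_output = get_delivery_date(order_id)
wf.add_tool(
    input=response.choices[0].message.tool_calls[0].function.arguments,
    output=str(tool_output),  # log the raw tool result, not the user-facing sentence
)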

Contributor:
Then separately, if this "Your delivery date is {tool_output}." string is what you're showing to the user, log that as its own step.
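For example (a sketch; whether wf.add_llm is the right step type for the user-facing reply depends on the Galileo workflows API in use, so treat that method and its arguments as assumptions):

user_reply = f"Your delivery date is {tool_output}."

# Assumption: the workflow object exposes an add_llm (or similar) step method.
# The point is just that the user-facing string gets logged as its own step,
# separate from the raw tool output above.
wf.add_llm(
    input=prompt["content"],
    output=user_reply,
    model="gpt-4o",  # assumed
)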

    )

    # On LLM steps, also run Protect
    if output_message.content is not None:
Contributor:
There's an if output_message.content is not None check already. Combine the two? Or rearrange the code?
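Roughly (a sketch, assuming nothing between the two checks needs to run when content is None):

if output_message.content is not None:
    # One check instead of two: run Protect and conclude in the same block
    response_protect = protect_galileo(prompt["content"], output_message.content)
    wf.conclude(output={"output": output_message.content})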

    )

    # Conclude the workflow
    wf.conclude(output={"output": output_message.content})
Contributor:
You're not using the output of the tool, or the output of Protect, here.
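Something like this, as a sketch (it assumes protect_galileo returns the text that should actually be surfaced, which is an assumption about that helper):

# Prefer the Protect-checked text, fall back to the raw model output.
final_output = response_protect if response_protect is not None else output_message.content
wf.conclude(output={"output": final_output})  # now reflects Protect (and, upstream, the tool result)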

    # On LLM steps, also run Protect
    if output_message.content is not None:
        response_protect = protect_galileo(prompt["content"], output_message.content)
Contributor:
Do something with the response? E.g., if Protect triggers, alter the response?
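For instance (a sketch; the return shape of protect_galileo, including whether it hands back replacement text when a rule triggers, is an assumption about that helper):

response_protect = protect_galileo(prompt["content"], output_message.content)

# Assumed helper contract: replacement text when a Protect rule fires,
# otherwise None / the original text.
final_text = output_message.content
if response_protect and response_protect != final_text:
    final_text = response_protect  # surface the Protect-adjusted reply instead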

# Conversational Flow
inputs = [
    {"role": "system", "content": "You are a helpful customer support assistant. Use the supplied tools to assist the user."},
Contributor:
I would not include the system message as an input here. It's not an input from the user; it's an instruction / your prompt template.
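i.e. something like (a sketch; the user message is a placeholder):

SYSTEM_PROMPT = "You are a helpful customer support assistant. Use the supplied tools to assist the user."

# Only genuine user turns live in inputs; the system prompt is prepended at call time.
inputs = [
    {"role": "user", "content": "Hi, can you tell me the delivery date for my order?"},  # placeholder
]
messages = [{"role": "system", "content": SYSTEM_PROMPT}, *inputs]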
