diff --git a/notebook/agentchat_swarm_graphrag_telemetry_trip_planner.ipynb b/notebook/agentchat_swarm_graphrag_telemetry_trip_planner.ipynb new file mode 100644 index 0000000000..ccb8ffb84c --- /dev/null +++ b/notebook/agentchat_swarm_graphrag_telemetry_trip_planner.ipynb @@ -0,0 +1,1906 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Using a local Telemetry server to monitor a GraphRAG agent\n", + "\n", + "In this notebook, we're building a trip planning swarm whose objective is to create an itinerary together with a customer. The end result will be an itinerary that has route times and distances calculated between activities.\n", + "\n", + "The following diagram outlines the key components of the Swarm, with highlights being:\n", + "\n", + "- FalkorDB agent using a GraphRAG database of restaurants and attractions\n", + "- Arize Phoenix to provide transparency using the OpenTelemetry standard\n", + "- Structured Output agent that will enforce a strict format for the accepted itinerary\n", + "- Routing agent that utilises the Google Maps API to calculate distances between activities\n", + "- Swarm orchestration utilising context variables" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Initialize Python environment MANUALLY\n", + "\n", + "- clone the AG2 repository and run from the project root directory\n", + "- there was an issue with the FalkorDB GraphRAG dependencies, so this manual setup is the easiest workaround for now\n", + "\n", + "### First step\n", + "```bash\n", + "pyenv local 3.11\n", + "```\n", + "\n", + "### Second step\n", + "```bash\n", + "python -m venv .venv\n", + "source .venv/bin/activate\n", + "```\n", + "\n", + "### Third step\n", + "```bash\n", + "pip install -e .\n", + "pip install graphrag_sdk\n", + "pip install '.[graph_rag_falkor_db]'\n", + "pip install ipykernel\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Install Docker 
Containers\n", + "\n", + "**Note:** This likely requires a Docker Compose file to get it to work reliably. It didn't require one on my system, but I plan to come back to it at some point.\n", + "\n", + "For now I'm more interested in instrumenting GraphRAG solutions to get better visibility into key interactions with the LLM -- in particular for entity and link detection." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### FalkorDB\n", + "\n", + "- UI endpoint: http://localhost:3000/graph\n", + "- sample query: `match path = ()-[]-() return path`" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 1, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# if you run the FalkorDB image with the --rm flag, the container is removed after it is stopped\n", + "# for more information refer to: https://docs.falkordb.com/\n", + "\n", + "# !docker run -p 6379:6379 -p 3000:3000 -it --rm falkordb/falkordb:latest\n", + "\n", + "import subprocess\n", + "\n", + "# Run the Docker container without interactive mode\n", + "subprocess.Popen([\n", + " \"docker\", \"run\", \"-p\", \"6379:6379\", \"-p\", \"3000:3000\",\n", + " \"--rm\", \"falkordb/falkordb:latest\"\n", + "])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Arize Phoenix\n", + "\n", + "- UI endpoint: http://localhost:6006" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# for more information refer to https://docs.arize.com/phoenix/tracing/integrations-tracing/autogen-support#docker\n", + "# !docker run -p 6006:6006 -p 4317:4317 arizephoenix/phoenix:latest\n", + "\n", + "import subprocess\n", + "\n", + "# Run the Docker container without 
interactive mode\n", + "subprocess.Popen([\n", + " \"docker\", \"run\", \"-p\", \"6006:6006\", \"-p\", \"4317:4317\",\n", + " \"--rm\", \"arizephoenix/phoenix:latest\"\n", + "])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Arize Phoenix: setup and configuration" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "11:C 08 Dec 2024 22:31:32.482 * oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo\n", + "11:C 08 Dec 2024 22:31:32.482 * Redis version=7.2.4, bits=64, commit=00000000, modified=0, pid=11, just started\n", + "11:C 08 Dec 2024 22:31:32.483 * Configuration loaded\n", + "11:M 08 Dec 2024 22:31:32.483 * monotonic clock: POSIX clock_gettime\n", + "11:M 08 Dec 2024 22:31:32.483 * Running mode=standalone, port=6379.\n", + "11:M 08 Dec 2024 22:31:32.489 * Enabled role change notification\n", + "11:M 08 Dec 2024 22:31:32.489 * Starting up FalkorDB version 4.4.0.\n", + "11:M 08 Dec 2024 22:31:32.490 * Thread pool created, using 20 threads.\n", + "11:M 08 Dec 2024 22:31:32.490 * Maximum number of OpenMP threads set to 20\n", + "11:M 08 Dec 2024 22:31:32.490 * Query backlog size: 1000\n", + "11:M 08 Dec 2024 22:31:32.490 * Module 'graph' loaded from /FalkorDB/bin/src/falkordb.so\n", + "11:M 08 Dec 2024 22:31:32.490 * Server initialized\n", + "11:M 08 Dec 2024 22:31:32.490 * Ready to accept connections tcp\n", + " β–² Next.js 14.1.0\n", + " - Local: http://localhost:3000\n", + " - Network: http://0.0.0.0:3000\n", + "\n", + " βœ“ Ready in 74ms\n", + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.0\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m24.3.1\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpip install --upgrade pip\u001b[0m\n" + ] + 
} + ], + "source": [ + "!pip install -q arize-phoenix-otel" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "πŸ”­ OpenTelemetry Tracing Details πŸ”­\n", + "| Phoenix Project: ag2-swarm-graphrag\n", + "| Span Processor: SimpleSpanProcessor\n", + "| Collector Endpoint: localhost:4317\n", + "| Transport: gRPC\n", + "| Transport Headers: {'user-agent': '****'}\n", + "| \n", + "| Using a default SpanProcessor. `add_span_processor` will overwrite this default.\n", + "| \n", + "| `register` has set this TracerProvider as the global OpenTelemetry default.\n", + "| To disable this behavior, call `register` with `set_global_tracer_provider=False`.\n", + "\n" + ] + } + ], + "source": [ + "from phoenix.otel import register\n", + "\n", + "# defaults to endpoint=\"http://localhost:4317\"\n", + "tracer_provider = register(\n", + " project_name=\"ag2-swarm-graphrag\", # Default is 'default'\n", + " endpoint=\"http://localhost:4317\", # Sends traces using gRPC\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.0\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m24.3.1\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpip install --upgrade pip\u001b[0m\n", + "πŸƒβ€β™€οΈβ€βž‘οΈ Running migrations on the database.\n", + "---------------------------\n", + "2024-12-08 22:31:35,465 INFO sqlalchemy.engine.Engine BEGIN (implicit)\n", + "2024-12-08 22:31:35,465 INFO sqlalchemy.engine.Engine PRAGMA main.table_info(\"alembic_version\")\n", + "2024-12-08 22:31:35,465 INFO sqlalchemy.engine.Engine [raw sql] ()\n", + "2024-12-08 
22:31:35,466 INFO sqlalchemy.engine.Engine PRAGMA temp.table_info(\"alembic_version\")\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine [raw sql] ()\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine PRAGMA main.table_info(\"alembic_version\")\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine [raw sql] ()\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine PRAGMA temp.table_info(\"alembic_version\")\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine [raw sql] ()\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE alembic_version (\n", + "\tversion_num VARCHAR(32) NOT NULL, \n", + "\tCONSTRAINT alembic_version_pkc PRIMARY KEY (version_num)\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,466 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,475 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE projects (\n", + "\tid INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tdescription VARCHAR, \n", + "\tgradient_start_color VARCHAR DEFAULT '#5bdbff' NOT NULL, \n", + "\tgradient_end_color VARCHAR DEFAULT '#1c76fc' NOT NULL, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_projects PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_projects_name UNIQUE (name)\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,475 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,479 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE traces (\n", + "\tid INTEGER NOT NULL, \n", + "\tproject_rowid INTEGER NOT NULL, \n", + "\ttrace_id VARCHAR NOT NULL, \n", + "\tstart_time TIMESTAMP NOT NULL, \n", + "\tend_time TIMESTAMP NOT NULL, \n", + "\tCONSTRAINT pk_traces PRIMARY KEY (id), \n", + "\tCONSTRAINT fk_traces_project_rowid_projects FOREIGN KEY(project_rowid) REFERENCES projects (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT uq_traces_trace_id 
UNIQUE (trace_id)\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,479 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,482 INFO sqlalchemy.engine.Engine CREATE INDEX ix_traces_start_time ON traces (start_time)\n", + "2024-12-08 22:31:35,482 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,486 INFO sqlalchemy.engine.Engine CREATE INDEX ix_traces_project_rowid ON traces (project_rowid)\n", + "2024-12-08 22:31:35,486 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,490 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE spans (\n", + "\tid INTEGER NOT NULL, \n", + "\ttrace_rowid INTEGER NOT NULL, \n", + "\tspan_id VARCHAR NOT NULL, \n", + "\tparent_id VARCHAR, \n", + "\tname VARCHAR NOT NULL, \n", + "\tspan_kind VARCHAR NOT NULL, \n", + "\tstart_time TIMESTAMP NOT NULL, \n", + "\tend_time TIMESTAMP NOT NULL, \n", + "\tattributes JSONB NOT NULL, \n", + "\tevents JSONB NOT NULL, \n", + "\tstatus_code VARCHAR DEFAULT 'UNSET' NOT NULL CONSTRAINT \"ck_spans_`valid_status`\" CHECK (status_code IN ('OK', 'ERROR', 'UNSET')), \n", + "\tstatus_message VARCHAR NOT NULL, \n", + "\tcumulative_error_count INTEGER NOT NULL, \n", + "\tcumulative_llm_token_count_prompt INTEGER NOT NULL, \n", + "\tcumulative_llm_token_count_completion INTEGER NOT NULL, \n", + "\tCONSTRAINT pk_spans PRIMARY KEY (id), \n", + "\tCONSTRAINT fk_spans_trace_rowid_traces FOREIGN KEY(trace_rowid) REFERENCES traces (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT uq_spans_span_id UNIQUE (span_id)\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,490 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,493 INFO sqlalchemy.engine.Engine CREATE INDEX ix_spans_parent_id ON spans (parent_id)\n", + "2024-12-08 22:31:35,493 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,496 INFO sqlalchemy.engine.Engine CREATE INDEX ix_spans_trace_rowid ON spans (trace_rowid)\n", + "2024-12-08 
22:31:35,496 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,500 INFO sqlalchemy.engine.Engine CREATE INDEX ix_spans_start_time ON spans (start_time)\n", + "2024-12-08 22:31:35,500 INFO sqlalchemy.engine.Engine [no key 0.00012s] ()\n", + "2024-12-08 22:31:35,504 INFO sqlalchemy.engine.Engine CREATE INDEX ix_latency ON spans ((end_time - start_time))\n", + "2024-12-08 22:31:35,504 INFO sqlalchemy.engine.Engine [no key 0.00011s] ()\n", + "2024-12-08 22:31:35,508 INFO sqlalchemy.engine.Engine CREATE INDEX ix_cumulative_llm_token_count_total ON spans ((cumulative_llm_token_count_prompt + cumulative_llm_token_count_completion))\n", + "2024-12-08 22:31:35,508 INFO sqlalchemy.engine.Engine [no key 0.00014s] ()\n", + "2024-12-08 22:31:35,512 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE span_annotations (\n", + "\tid INTEGER NOT NULL, \n", + "\tspan_rowid INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tlabel VARCHAR, \n", + "\tscore FLOAT, \n", + "\texplanation VARCHAR, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tannotator_kind VARCHAR NOT NULL CONSTRAINT \"ck_span_annotations_`valid_annotator_kind`\" CHECK (annotator_kind IN ('LLM', 'HUMAN')), \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_span_annotations PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_span_annotations_name_span_rowid UNIQUE (name, span_rowid), \n", + "\tCONSTRAINT fk_span_annotations_span_rowid_spans FOREIGN KEY(span_rowid) REFERENCES spans (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,512 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,515 INFO sqlalchemy.engine.Engine CREATE INDEX ix_span_annotations_span_rowid ON span_annotations (span_rowid)\n", + "2024-12-08 22:31:35,515 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,518 INFO sqlalchemy.engine.Engine 
CREATE INDEX ix_span_annotations_label ON span_annotations (label)\n", + "2024-12-08 22:31:35,518 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,521 INFO sqlalchemy.engine.Engine CREATE INDEX ix_span_annotations_score ON span_annotations (score)\n", + "2024-12-08 22:31:35,521 INFO sqlalchemy.engine.Engine [no key 0.00010s] ()\n", + "2024-12-08 22:31:35,525 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE trace_annotations (\n", + "\tid INTEGER NOT NULL, \n", + "\ttrace_rowid INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tlabel VARCHAR, \n", + "\tscore FLOAT, \n", + "\texplanation VARCHAR, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tannotator_kind VARCHAR NOT NULL CONSTRAINT \"ck_trace_annotations_`valid_annotator_kind`\" CHECK (annotator_kind IN ('LLM', 'HUMAN')), \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_trace_annotations PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_trace_annotations_name_trace_rowid UNIQUE (name, trace_rowid), \n", + "\tCONSTRAINT fk_trace_annotations_trace_rowid_traces FOREIGN KEY(trace_rowid) REFERENCES traces (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,525 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,528 INFO sqlalchemy.engine.Engine CREATE INDEX ix_trace_annotations_label ON trace_annotations (label)\n", + "2024-12-08 22:31:35,528 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,531 INFO sqlalchemy.engine.Engine CREATE INDEX ix_trace_annotations_score ON trace_annotations (score)\n", + "2024-12-08 22:31:35,531 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,533 INFO sqlalchemy.engine.Engine CREATE INDEX ix_trace_annotations_trace_rowid ON trace_annotations (trace_rowid)\n", + "2024-12-08 22:31:35,533 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + 
"2024-12-08 22:31:35,537 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE document_annotations (\n", + "\tid INTEGER NOT NULL, \n", + "\tspan_rowid INTEGER NOT NULL, \n", + "\tdocument_position INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tlabel VARCHAR, \n", + "\tscore FLOAT, \n", + "\texplanation VARCHAR, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tannotator_kind VARCHAR NOT NULL CONSTRAINT \"ck_document_annotations_`valid_annotator_kind`\" CHECK (annotator_kind IN ('LLM', 'HUMAN')), \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_document_annotations PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_document_annotations_name_span_rowid_document_position UNIQUE (name, span_rowid, document_position), \n", + "\tCONSTRAINT fk_document_annotations_span_rowid_spans FOREIGN KEY(span_rowid) REFERENCES spans (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,537 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,540 INFO sqlalchemy.engine.Engine CREATE INDEX ix_document_annotations_score ON document_annotations (score)\n", + "2024-12-08 22:31:35,540 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,543 INFO sqlalchemy.engine.Engine CREATE INDEX ix_document_annotations_label ON document_annotations (label)\n", + "2024-12-08 22:31:35,543 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,546 INFO sqlalchemy.engine.Engine CREATE INDEX ix_document_annotations_span_rowid ON document_annotations (span_rowid)\n", + "2024-12-08 22:31:35,546 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,550 INFO sqlalchemy.engine.Engine INSERT INTO projects (name, description) VALUES (?, ?)\n", + "2024-12-08 22:31:35,550 INFO sqlalchemy.engine.Engine [generated in 0.00015s] ('default', 'Default project')\n", + "2024-12-08 22:31:35,551 INFO 
sqlalchemy.engine.Engine INSERT INTO alembic_version (version_num) VALUES ('cf03bd6bae1d') RETURNING version_num\n", + "2024-12-08 22:31:35,551 INFO sqlalchemy.engine.Engine [generated in 0.00009s] ()\n", + "2024-12-08 22:31:35,552 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE datasets (\n", + "\tid INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tdescription VARCHAR, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_datasets PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_datasets_name UNIQUE (name)\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,552 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,552 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE dataset_versions (\n", + "\tid INTEGER NOT NULL, \n", + "\tdataset_id INTEGER NOT NULL, \n", + "\tdescription VARCHAR, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_dataset_versions PRIMARY KEY (id), \n", + "\tCONSTRAINT fk_dataset_versions_dataset_id_datasets FOREIGN KEY(dataset_id) REFERENCES datasets (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,552 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,553 INFO sqlalchemy.engine.Engine CREATE INDEX ix_dataset_versions_dataset_id ON dataset_versions (dataset_id)\n", + "2024-12-08 22:31:35,553 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,553 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE dataset_examples (\n", + "\tid INTEGER NOT NULL, \n", + "\tdataset_id INTEGER NOT NULL, \n", + "\tspan_rowid INTEGER, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_dataset_examples PRIMARY KEY (id), \n", + "\tCONSTRAINT fk_dataset_examples_dataset_id_datasets FOREIGN 
KEY(dataset_id) REFERENCES datasets (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT fk_dataset_examples_span_rowid_spans FOREIGN KEY(span_rowid) REFERENCES spans (id) ON DELETE SET NULL\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,553 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,554 INFO sqlalchemy.engine.Engine CREATE INDEX ix_dataset_examples_dataset_id ON dataset_examples (dataset_id)\n", + "2024-12-08 22:31:35,554 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,554 INFO sqlalchemy.engine.Engine CREATE INDEX ix_dataset_examples_span_rowid ON dataset_examples (span_rowid)\n", + "2024-12-08 22:31:35,554 INFO sqlalchemy.engine.Engine [no key 0.00004s] ()\n", + "2024-12-08 22:31:35,555 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE dataset_example_revisions (\n", + "\tid INTEGER NOT NULL, \n", + "\tdataset_example_id INTEGER NOT NULL, \n", + "\tdataset_version_id INTEGER NOT NULL, \n", + "\tinput JSONB NOT NULL, \n", + "\toutput JSONB NOT NULL, \n", + "\tmetadata JSONB NOT NULL, \n", + "\trevision_kind VARCHAR NOT NULL CONSTRAINT \"ck_dataset_example_revisions_`valid_revision_kind`\" CHECK (revision_kind IN ('CREATE', 'PATCH', 'DELETE')), \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_dataset_example_revisions PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_dataset_example_revisions_dataset_example_id_dataset_version_id UNIQUE (dataset_example_id, dataset_version_id), \n", + "\tCONSTRAINT fk_dataset_example_revisions_dataset_example_id_dataset_examples FOREIGN KEY(dataset_example_id) REFERENCES dataset_examples (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT fk_dataset_example_revisions_dataset_version_id_dataset_versions FOREIGN KEY(dataset_version_id) REFERENCES dataset_versions (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,555 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,555 INFO 
sqlalchemy.engine.Engine CREATE INDEX ix_dataset_example_revisions_dataset_example_id ON dataset_example_revisions (dataset_example_id)\n", + "2024-12-08 22:31:35,555 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,555 INFO sqlalchemy.engine.Engine CREATE INDEX ix_dataset_example_revisions_dataset_version_id ON dataset_example_revisions (dataset_version_id)\n", + "2024-12-08 22:31:35,555 INFO sqlalchemy.engine.Engine [no key 0.00004s] ()\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE experiments (\n", + "\tid INTEGER NOT NULL, \n", + "\tdataset_id INTEGER NOT NULL, \n", + "\tdataset_version_id INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tdescription VARCHAR, \n", + "\trepetitions INTEGER NOT NULL, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tproject_name VARCHAR, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT pk_experiments PRIMARY KEY (id), \n", + "\tCONSTRAINT fk_experiments_dataset_id_datasets FOREIGN KEY(dataset_id) REFERENCES datasets (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT fk_experiments_dataset_version_id_dataset_versions FOREIGN KEY(dataset_version_id) REFERENCES dataset_versions (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine CREATE INDEX ix_experiments_dataset_id ON experiments (dataset_id)\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine CREATE INDEX ix_experiments_dataset_version_id ON experiments (dataset_version_id)\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine [no key 0.00003s] ()\n", + "2024-12-08 22:31:35,556 INFO sqlalchemy.engine.Engine CREATE INDEX ix_experiments_project_name ON experiments 
(project_name)\n", + "2024-12-08 22:31:35,557 INFO sqlalchemy.engine.Engine [no key 0.00004s] ()\n", + "2024-12-08 22:31:35,558 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE experiment_runs (\n", + "\tid INTEGER NOT NULL, \n", + "\texperiment_id INTEGER NOT NULL, \n", + "\tdataset_example_id INTEGER NOT NULL, \n", + "\trepetition_number INTEGER NOT NULL, \n", + "\ttrace_id VARCHAR, \n", + "\toutput JSONB NOT NULL, \n", + "\tstart_time TIMESTAMP NOT NULL, \n", + "\tend_time TIMESTAMP NOT NULL, \n", + "\tprompt_token_count INTEGER, \n", + "\tcompletion_token_count INTEGER, \n", + "\terror VARCHAR, \n", + "\tCONSTRAINT pk_experiment_runs PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_experiment_runs_experiment_id_dataset_example_id_repetition_number UNIQUE (experiment_id, dataset_example_id, repetition_number), \n", + "\tCONSTRAINT fk_experiment_runs_experiment_id_experiments FOREIGN KEY(experiment_id) REFERENCES experiments (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT fk_experiment_runs_dataset_example_id_dataset_examples FOREIGN KEY(dataset_example_id) REFERENCES dataset_examples (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,558 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,558 INFO sqlalchemy.engine.Engine CREATE INDEX ix_experiment_runs_experiment_id ON experiment_runs (experiment_id)\n", + "2024-12-08 22:31:35,558 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,558 INFO sqlalchemy.engine.Engine CREATE INDEX ix_experiment_runs_dataset_example_id ON experiment_runs (dataset_example_id)\n", + "2024-12-08 22:31:35,558 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,559 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE experiment_run_annotations (\n", + "\tid INTEGER NOT NULL, \n", + "\texperiment_run_id INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tannotator_kind VARCHAR NOT NULL CONSTRAINT 
\"ck_experiment_run_annotations_`valid_annotator_kind`\" CHECK (annotator_kind IN ('LLM', 'CODE', 'HUMAN')), \n", + "\tlabel VARCHAR, \n", + "\tscore FLOAT, \n", + "\texplanation VARCHAR, \n", + "\ttrace_id VARCHAR, \n", + "\terror VARCHAR, \n", + "\tmetadata JSONB NOT NULL, \n", + "\tstart_time TIMESTAMP NOT NULL, \n", + "\tend_time TIMESTAMP NOT NULL, \n", + "\tCONSTRAINT pk_experiment_run_annotations PRIMARY KEY (id), \n", + "\tCONSTRAINT uq_experiment_run_annotations_experiment_run_id_name UNIQUE (experiment_run_id, name), \n", + "\tCONSTRAINT fk_experiment_run_annotations_experiment_run_id_experiment_runs FOREIGN KEY(experiment_run_id) REFERENCES experiment_runs (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,559 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,559 INFO sqlalchemy.engine.Engine CREATE INDEX ix_experiment_run_annotations_experiment_run_id ON experiment_run_annotations (experiment_run_id)\n", + "2024-12-08 22:31:35,559 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,560 INFO sqlalchemy.engine.Engine UPDATE alembic_version SET version_num='10460e46d750' WHERE alembic_version.version_num = 'cf03bd6bae1d'\n", + "2024-12-08 22:31:35,560 INFO sqlalchemy.engine.Engine [generated in 0.00024s] ()\n", + "2024-12-08 22:31:35,561 INFO sqlalchemy.engine.Engine ALTER TABLE spans ADD COLUMN llm_token_count_prompt INTEGER\n", + "2024-12-08 22:31:35,561 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,561 INFO sqlalchemy.engine.Engine ALTER TABLE spans ADD COLUMN llm_token_count_completion INTEGER\n", + "2024-12-08 22:31:35,561 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n" + ] + } + ], + "source": [ + "## install python telemetry and openai library requirements\n", + "!pip install -q openinference-instrumentation-openai openai 'httpx<0.28'" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + 
"name": "stdout", + "output_type": "stream", + "text": [ + "2024-12-08 22:31:35,563 INFO sqlalchemy.engine.Engine UPDATE spans SET llm_token_count_prompt=(JSON_EXTRACT(spans.attributes, ?)), llm_token_count_completion=(JSON_EXTRACT(spans.attributes, ?))\n", + "2024-12-08 22:31:35,563 INFO sqlalchemy.engine.Engine [generated in 0.00013s] ('$.\"llm\".\"token_count\".\"prompt\"', '$.\"llm\".\"token_count\".\"completion\"')\n", + "2024-12-08 22:31:35,563 INFO sqlalchemy.engine.Engine UPDATE alembic_version SET version_num='3be8647b87d8' WHERE alembic_version.version_num = '10460e46d750'\n", + "2024-12-08 22:31:35,563 INFO sqlalchemy.engine.Engine [generated in 0.00008s] ()\n", + "2024-12-08 22:31:35,564 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE user_roles (\n", + "\tid INTEGER NOT NULL, \n", + "\tname VARCHAR NOT NULL, \n", + "\tCONSTRAINT pk_user_roles PRIMARY KEY (id)\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,564 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,564 INFO sqlalchemy.engine.Engine CREATE UNIQUE INDEX ix_user_roles_name ON user_roles (name)\n", + "2024-12-08 22:31:35,564 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "2024-12-08 22:31:35,565 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE users (\n", + "\tid INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, \n", + "\tuser_role_id INTEGER NOT NULL, \n", + "\tusername VARCHAR NOT NULL, \n", + "\temail VARCHAR NOT NULL, \n", + "\tprofile_picture_url VARCHAR, \n", + "\tpassword_hash BLOB, \n", + "\tpassword_salt BLOB, \n", + "\treset_password BOOLEAN NOT NULL, \n", + "\toauth2_client_id VARCHAR, \n", + "\toauth2_user_id VARCHAR, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tupdated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\tCONSTRAINT \"ck_users_`password_hash_and_salt`\" CHECK ((password_hash IS NULL) = (password_salt IS NULL)), \n", 
+ "\tCONSTRAINT \"ck_users_`oauth2_client_id_and_user_id`\" CHECK ((oauth2_client_id IS NULL) = (oauth2_user_id IS NULL)), \n", + "\tCONSTRAINT \"ck_users_`exactly_one_auth_method`\" CHECK ((password_hash IS NULL) != (oauth2_client_id IS NULL)), \n", + "\tCONSTRAINT uq_users_oauth2_client_id_oauth2_user_id UNIQUE (oauth2_client_id, oauth2_user_id), \n", + "\tCONSTRAINT fk_users_user_role_id_user_roles FOREIGN KEY(user_role_id) REFERENCES user_roles (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,565 INFO sqlalchemy.engine.Engine [no key 0.00009s] ()\n", + "2024-12-08 22:31:35,566 INFO sqlalchemy.engine.Engine CREATE INDEX ix_users_user_role_id ON users (user_role_id)\n", + "2024-12-08 22:31:35,566 INFO sqlalchemy.engine.Engine [no key 0.00010s] ()\n", + "2024-12-08 22:31:35,566 INFO sqlalchemy.engine.Engine CREATE UNIQUE INDEX ix_users_username ON users (username)\n", + "2024-12-08 22:31:35,566 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,566 INFO sqlalchemy.engine.Engine CREATE INDEX ix_users_oauth2_client_id ON users (oauth2_client_id)\n", + "2024-12-08 22:31:35,566 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,567 INFO sqlalchemy.engine.Engine CREATE INDEX ix_users_oauth2_user_id ON users (oauth2_user_id)\n", + "2024-12-08 22:31:35,567 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,567 INFO sqlalchemy.engine.Engine CREATE UNIQUE INDEX ix_users_email ON users (email)\n", + "2024-12-08 22:31:35,567 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,568 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE password_reset_tokens (\n", + "\tid INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, \n", + "\tuser_id INTEGER, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\texpires_at TIMESTAMP NOT NULL, \n", + "\tCONSTRAINT fk_password_reset_tokens_user_id_users FOREIGN KEY(user_id) REFERENCES users (id) 
ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,568 INFO sqlalchemy.engine.Engine [no key 0.00011s] ()\n", + "2024-12-08 22:31:35,569 INFO sqlalchemy.engine.Engine CREATE UNIQUE INDEX ix_password_reset_tokens_user_id ON password_reset_tokens (user_id)\n", + "2024-12-08 22:31:35,569 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,569 INFO sqlalchemy.engine.Engine CREATE INDEX ix_password_reset_tokens_expires_at ON password_reset_tokens (expires_at)\n", + "2024-12-08 22:31:35,569 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,570 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE refresh_tokens (\n", + "\tid INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, \n", + "\tuser_id INTEGER, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\texpires_at TIMESTAMP NOT NULL, \n", + "\tCONSTRAINT fk_refresh_tokens_user_id_users FOREIGN KEY(user_id) REFERENCES users (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,570 INFO sqlalchemy.engine.Engine [no key 0.00012s] ()\n", + "2024-12-08 22:31:35,570 INFO sqlalchemy.engine.Engine CREATE INDEX ix_refresh_tokens_user_id ON refresh_tokens (user_id)\n", + "2024-12-08 22:31:35,570 INFO sqlalchemy.engine.Engine [no key 0.00008s] ()\n", + "2024-12-08 22:31:35,570 INFO sqlalchemy.engine.Engine CREATE INDEX ix_refresh_tokens_expires_at ON refresh_tokens (expires_at)\n", + "2024-12-08 22:31:35,570 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE access_tokens (\n", + "\tid INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, \n", + "\tuser_id INTEGER, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\texpires_at TIMESTAMP NOT NULL, \n", + "\trefresh_token_id INTEGER, \n", + "\tCONSTRAINT fk_access_tokens_user_id_users FOREIGN KEY(user_id) REFERENCES users (id) ON DELETE CASCADE, \n", + "\tCONSTRAINT 
fk_access_tokens_refresh_token_id_refresh_tokens FOREIGN KEY(refresh_token_id) REFERENCES refresh_tokens (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine [no key 0.00011s] ()\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine CREATE INDEX ix_access_tokens_expires_at ON access_tokens (expires_at)\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine [no key 0.00006s] ()\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine CREATE UNIQUE INDEX ix_access_tokens_refresh_token_id ON access_tokens (refresh_token_id)\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine CREATE INDEX ix_access_tokens_user_id ON access_tokens (user_id)\n", + "2024-12-08 22:31:35,572 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,573 INFO sqlalchemy.engine.Engine \n", + "CREATE TABLE api_keys (\n", + "\tid INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, \n", + "\tuser_id INTEGER, \n", + "\tname VARCHAR NOT NULL, \n", + "\tdescription VARCHAR, \n", + "\tcreated_at TIMESTAMP DEFAULT (CURRENT_TIMESTAMP) NOT NULL, \n", + "\texpires_at TIMESTAMP, \n", + "\tCONSTRAINT fk_api_keys_user_id_users FOREIGN KEY(user_id) REFERENCES users (id) ON DELETE CASCADE\n", + ")\n", + "\n", + "\n", + "2024-12-08 22:31:35,573 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,573 INFO sqlalchemy.engine.Engine CREATE INDEX ix_api_keys_user_id ON api_keys (user_id)\n", + "2024-12-08 22:31:35,573 INFO sqlalchemy.engine.Engine [no key 0.00005s] ()\n", + "2024-12-08 22:31:35,573 INFO sqlalchemy.engine.Engine CREATE INDEX ix_api_keys_expires_at ON api_keys (expires_at)\n", + "2024-12-08 22:31:35,574 INFO sqlalchemy.engine.Engine [no key 0.00007s] ()\n", + "2024-12-08 22:31:35,574 INFO sqlalchemy.engine.Engine UPDATE alembic_version SET version_num='cd164e83824f' WHERE alembic_version.version_num = 
'3be8647b87d8'\n", + "2024-12-08 22:31:35,574 INFO sqlalchemy.engine.Engine [generated in 0.00011s] ()\n", + "2024-12-08 22:31:35,574 INFO sqlalchemy.engine.Engine COMMIT\n", + "---------------------------\n", + "βœ… Migrations complete.\n", + "\n", + "\n", + "β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ•— β–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—β–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ•—β–ˆβ–ˆβ•—β–ˆβ–ˆβ•— β–ˆβ–ˆβ•—\n", + "β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•—β–ˆβ–ˆβ•‘ β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β•β–ˆβ–ˆβ•—β–ˆβ–ˆβ•”β•β•β•β•β•β–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘β•šβ–ˆβ–ˆβ•—β–ˆβ–ˆβ•”β•\n", + "β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•”β•β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘ β–ˆβ–ˆβ•‘β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ•”β–ˆβ–ˆβ•— β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘ β•šβ–ˆβ–ˆβ–ˆβ•”β•\n", + "β–ˆβ–ˆβ•”β•β•β•β• β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘ β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β• β–ˆβ–ˆβ•‘β•šβ–ˆβ–ˆβ•—β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘ β–ˆβ–ˆβ•”β–ˆβ–ˆβ•—\n", + "β–ˆβ–ˆβ•‘ β–ˆβ–ˆβ•‘ β–ˆβ–ˆβ•‘β•šβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•”β•β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—β–ˆβ–ˆβ•‘ β•šβ–ˆβ–ˆβ–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β• β–ˆβ–ˆβ•—\n", + "β•šβ•β• β•šβ•β• β•šβ•β• β•šβ•β•β•β•β•β• β•šβ•β•β•β•β•β•β•β•šβ•β• β•šβ•β•β•β•β•šβ•β•β•šβ•β• β•šβ•β• v6.1.0\n", + "\n", + "|\n", + "| 🌎 Join our Community 🌎\n", + "| https://join.slack.com/t/arize-ai/shared_invite/zt-1px8dcmlf-fmThhDFD_V_48oU7ALan4Q\n", + "|\n", + "| ⭐️ Leave us a Star ⭐️\n", + "| https://github.com/Arize-ai/phoenix\n", + "|\n", + "| πŸ“š Documentation πŸ“š\n", + "| https://docs.arize.com/phoenix\n", + "|\n", + "| πŸš€ Phoenix Server πŸš€\n", + "| Phoenix UI: http://0.0.0.0:6006\n", + "| Authentication: False\n", + "| Websockets: True\n", + "| Log traces:\n", + "| - gRPC: http://0.0.0.0:4317\n", + "| - HTTP: http://0.0.0.0:6006/v1/traces\n", + "| Storage: sqlite:////root/.phoenix/phoenix.db\n", + "\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "INFO: Started server process [1]\n", + "INFO: Waiting for application startup.\n", + "INFO: Application startup complete.\n", + "INFO: Uvicorn running on 
http://0.0.0.0:6006 (Press CTRL+C to quit)\n" + ] + } + ], + "source": [ + "from openinference.instrumentation.openai import OpenAIInstrumentor\n", + "\n", + "OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Google Maps API Key\n", + "\n", + "To use Google's API to calculate travel times, you will need to have enabled the `Directions API` in your Google Maps Platform. You can get an API key and free quota; see [here](https://developers.google.com/maps/documentation/directions/overview) and [here](https://developers.google.com/maps/get-started) for more details.\n", + "\n", + "Once you have your API key, set your environment variable `GOOGLE_MAP_API_KEY` to the key.\n", + "\n", + "NOTE: These instructions can be confusing given how environment variables are managed overall. One way or another, you need to set the `GOOGLE_MAP_API_KEY` environment variable, and AG2 should give better visibility into when and why individual API calls fail."
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "\n", + "# os.getenv returns None when the variable is unset, and assigning None to os.environ raises a TypeError, so check instead\n", + "assert os.getenv(\"GOOGLE_MAP_API_KEY\") is not None, \"Please set the GOOGLE_MAP_API_KEY environment variable\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Set Configuration and OpenAI API Key\n", + "\n", + "**Create an OAI_CONFIG_LIST file** in the AG2 project `notebook` directory based on the OAI_CONFIG_LIST_sample file from the root directory.\n", + "\n", + "By default, FalkorDB uses OpenAI LLMs, which requires an OpenAI API key in your environment variable `OPENAI_API_KEY`.\n", + "\n", + "You can utilise an OAI_CONFIG_LIST file and extract the OpenAI API key and put it in the environment, as will be shown in the following cell.\n", + "\n", + "Alternatively, you can load the environment variable yourself.\n", + "\n", + "````{=mdx}\n", + ":::tip\n", + "Learn more about configuring LLMs for agents [here](/docs/topics/llm_configuration).\n", + ":::\n", + "````" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "\n", + "import autogen\n", + "\n", + "config_list = autogen.config_list_from_json(env_or_file=\"OAI_CONFIG_LIST\", filter_dict={\"model\": [\"gpt-4o\"]})\n", + "llm_config = {\"config_list\": config_list, \"timeout\": 120}\n", + "\n", + "# Put the OpenAI API key into the environment, either from the config_list or by setting the env variable yourself\n", + "# os.environ[\"OPENAI_API_KEY\"] = config_list[0][\"api_key\"]\n", + "assert os.getenv(\"OPENAI_API_KEY\") is not None, \"Please set the OPENAI_API_KEY environment variable\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Prepare the FalkorDB GraphRAG database\n", + "\n", + "Using 3 sample JSON data files from our GitHub repository, we will create a specific ontology for our GraphRAG database and then populate it.\n", + "\n", + "Creating a specific ontology that matches the types of queries makes for a more optimal database and is more 
cost efficient when populating the knowledge graph." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "from autogen.agentchat.contrib.graph_rag.document import Document, DocumentType\n", + "\n", + "# 3 Files (adjust path as necessary)\n", + "input_paths = [\n", + " \"../test/agentchat/contrib/graph_rag/trip_planner_data/attractions.json\",\n", + " \"../test/agentchat/contrib/graph_rag/trip_planner_data/cities.json\",\n", + " \"../test/agentchat/contrib/graph_rag/trip_planner_data/restaurants.json\",\n", + "]\n", + "input_documents = [Document(doctype=DocumentType.TEXT, path_or_url=input_path) for input_path in input_paths]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create Ontology\n", + "\n", + "Entities: Country, City, Attraction, Restaurant\n", + "\n", + "Relationships: City in Country, Attraction in City, Restaurant in City" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "from graphrag_sdk import Attribute, AttributeType, Entity, Ontology, Relation\n", + "\n", + "# Attraction + Restaurant + City + Country Ontology\n", + "trip_data_ontology = Ontology()\n", + "\n", + "trip_data_ontology.add_entity(\n", + " Entity(\n", + " label=\"Country\",\n", + " attributes=[\n", + " Attribute(\n", + " name=\"name\",\n", + " attr_type=AttributeType.STRING,\n", + " required=True,\n", + " unique=True,\n", + " ),\n", + " ],\n", + " )\n", + ")\n", + "trip_data_ontology.add_entity(\n", + " Entity(\n", + " label=\"City\",\n", + " attributes=[\n", + " Attribute(\n", + " name=\"name\",\n", + " attr_type=AttributeType.STRING,\n", + " required=True,\n", + " unique=True,\n", + " ),\n", + " Attribute(\n", + " name=\"weather\",\n", + " attr_type=AttributeType.STRING,\n", + " required=False,\n", + " unique=False,\n", + " ),\n", + " Attribute(\n", + " name=\"population\",\n", + " attr_type=AttributeType.NUMBER,\n", + " 
required=False,\n", + "                unique=False,\n", + "            ),\n", + "        ],\n", + "    )\n", + ")\n", + "trip_data_ontology.add_entity(\n", + "    Entity(\n", + "        label=\"Restaurant\",\n", + "        attributes=[\n", + "            Attribute(\n", + "                name=\"name\",\n", + "                attr_type=AttributeType.STRING,\n", + "                required=True,\n", + "                unique=True,\n", + "            ),\n", + "            Attribute(\n", + "                name=\"description\",\n", + "                attr_type=AttributeType.STRING,\n", + "                required=False,\n", + "                unique=False,\n", + "            ),\n", + "            Attribute(\n", + "                name=\"rating\",\n", + "                attr_type=AttributeType.NUMBER,\n", + "                required=False,\n", + "                unique=False,\n", + "            ),\n", + "            Attribute(\n", + "                name=\"food_type\",\n", + "                attr_type=AttributeType.STRING,\n", + "                required=False,\n", + "                unique=False,\n", + "            ),\n", + "        ],\n", + "    )\n", + ")\n", + "trip_data_ontology.add_entity(\n", + "    Entity(\n", + "        label=\"Attraction\",\n", + "        attributes=[\n", + "            Attribute(\n", + "                name=\"name\",\n", + "                attr_type=AttributeType.STRING,\n", + "                required=True,\n", + "                unique=True,\n", + "            ),\n", + "            Attribute(\n", + "                name=\"description\",\n", + "                attr_type=AttributeType.STRING,\n", + "                required=False,\n", + "                unique=False,\n", + "            ),\n", + "            Attribute(\n", + "                name=\"type\",\n", + "                attr_type=AttributeType.STRING,\n", + "                required=False,\n", + "                unique=False,\n", + "            ),\n", + "        ],\n", + "    )\n", + ")\n", + "trip_data_ontology.add_relation(\n", + "    Relation(\n", + "        label=\"IN_COUNTRY\",\n", + "        source=\"City\",\n", + "        target=\"Country\",\n", + "    )\n", + ")\n", + "trip_data_ontology.add_relation(\n", + "    Relation(\n", + "        label=\"IN_CITY\",\n", + "        source=\"Restaurant\",\n", + "        target=\"City\",\n", + "    )\n", + ")\n", + "trip_data_ontology.add_relation(\n", + "    Relation(\n", + "        label=\"IN_CITY\",\n", + "        source=\"Attraction\",\n", + "        target=\"City\",\n", + "    )\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Initialize FalkorDB and Query Engine\n", + "\n", + "Remember: Change your host, port, and preferred 
OpenAI model if needed (gpt-4o-mini or better is recommended)." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO: 172.17.0.1:47154 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:47170 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:47180 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:47176 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:47170 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:47154 - \"POST /graphql HTTP/1.1\" 200 OK\n" + ] + } + ], + "source": [ + "from graphrag_sdk.models.openai import OpenAiGenerativeModel\n", + "\n", + "from autogen.agentchat.contrib.graph_rag.falkor_graph_query_engine import FalkorGraphQueryEngine\n", + "from autogen.agentchat.contrib.graph_rag.falkor_graph_rag_capability import FalkorGraphRagCapability\n", + "\n", + "# Create FalkorGraphQueryEngine\n", + "query_engine = FalkorGraphQueryEngine(\n", + "    name=\"trip_data\",\n", + "    host=\"localhost\", # change to a specific IP address if you run into issues connecting to your local instance\n", + "    port=6379, # if needed\n", + "    ontology=trip_data_ontology,\n", + "    model=OpenAiGenerativeModel(\"gpt-4o\"),\n", + ")\n", + "\n", + "# Ingest data and initialize the database\n", + "query_engine.init_db(input_doc=input_documents)\n", + "\n", + "# If you have already ingested and created the database, you can use connect_db instead of init_db\n", + "# query_engine.connect_db()" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "# IMPORTS\n", + "import copy\n", + "import json\n", + "import os\n", + "from typing import Any, Dict\n", + "\n", + "import requests\n", + "from pydantic import BaseModel\n", + "\n", + "from autogen import (\n", + "    AFTER_WORK,\n", + "    ON_CONDITION,\n", + "    AfterWorkOption,\n", + "    SwarmAgent,\n", + "    SwarmResult,\n", + "    
UserProxyAgent,\n", + "    initiate_swarm_chat,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Pydantic model for Structured Output\n", + "\n", + "Utilising OpenAI's [Structured Outputs](https://platform.openai.com/docs/guides/structured-outputs), our Structured Output agent's responses will be constrained to this Pydantic model.\n", + "\n", + "The itinerary is structured hierarchically:\n", + "an Itinerary contains Day(s), and each Day contains Event(s)" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [], + "source": [ + "class Event(BaseModel):\n", + "    type: str  # Attraction, Restaurant, Travel\n", + "    location: str\n", + "    city: str\n", + "    description: str\n", + "\n", + "\n", + "class Day(BaseModel):\n", + "    events: list[Event]\n", + "\n", + "\n", + "class Itinerary(BaseModel):\n", + "    days: list[Day]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Google Maps Platform\n", + "\n", + "The following functions query the Directions API to get travel times." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [], + "source": [ + "def _fetch_travel_time(origin: str, destination: str) -> dict:\n", + "    \"\"\"\n", + "    Retrieves route information using Google Maps Directions API.\n", + "    API documentation at https://developers.google.com/maps/documentation/directions/get-directions\n", + "    \"\"\"\n", + "    endpoint = \"https://maps.googleapis.com/maps/api/directions/json\"\n", + "    params = {\n", + "        \"origin\": origin,\n", + "        \"destination\": destination,\n", + "        \"mode\": \"walking\", # driving (default), bicycling, transit\n", + "        \"key\": os.environ.get(\"GOOGLE_MAP_API_KEY\"),\n", + "    }\n", + "\n", + "    response = requests.get(endpoint, params=params)\n", + "    if response.status_code == 200:\n", + "        return response.json()\n", + "    else:\n", + "        return {\"error\": \"Failed to retrieve the route information\", \"status_code\": response.status_code}\n", + "\n", + "\n", + "def update_itinerary_with_travel_times(context_variables: dict) -> SwarmResult:\n", + "    \"\"\"Update the complete itinerary with travel times between each event.\"\"\"\n", + "\n", + "    # Ensure that we have a structured itinerary; if not, hand back to the structured_output_agent to create it\n", + "    if context_variables.get(\"structured_itinerary\") is None:\n", + "        return SwarmResult(\n", + "            agent=\"structured_output_agent\",\n", + "            values=\"Structured itinerary not found, please create the structured output, structured_output_agent.\",\n", + "        )\n", + "    elif \"timed_itinerary\" in context_variables:\n", + "        return SwarmResult(values=\"Timed itinerary already done, inform the customer that their itinerary is ready!\")\n", + "\n", + "    # Process the itinerary, converting it back to an object and working through each 
event to work out travel time and distance\n", + " itinerary_object = Itinerary.model_validate(json.loads(context_variables[\"structured_itinerary\"]))\n", + " for day in itinerary_object.days:\n", + " events = day.events\n", + " new_events = []\n", + " pre_event, cur_event = None, None\n", + " event_count = len(events)\n", + " index = 0\n", + " while index < event_count:\n", + " if index > 0:\n", + " pre_event = events[index - 1]\n", + "\n", + " cur_event = events[index]\n", + " if pre_event:\n", + " origin = \", \".join([pre_event.location, pre_event.city])\n", + " destination = \", \".join([cur_event.location, cur_event.city])\n", + " maps_api_response = _fetch_travel_time(origin=origin, destination=destination)\n", + " try:\n", + " leg = maps_api_response[\"routes\"][0][\"legs\"][0]\n", + " travel_time_txt = f\"{leg['duration']['text']}, ({leg['distance']['text']})\"\n", + " new_events.append(\n", + " Event(\n", + " type=\"Travel\",\n", + " location=f\"walking from {pre_event.location} to {cur_event.location}\",\n", + " city=cur_event.city,\n", + " description=travel_time_txt,\n", + " )\n", + " )\n", + " except Exception:\n", + " print(f\"Note: Unable to get travel time from {origin} to {destination}\")\n", + " new_events.append(cur_event)\n", + " index += 1\n", + " day.events = new_events\n", + "\n", + " context_variables[\"timed_itinerary\"] = itinerary_object.model_dump()\n", + "\n", + " return SwarmResult(context_variables=context_variables, values=\"Timed itinerary added to context with travel times\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Swarm\n", + "\n", + "### Context Variables\n", + "Our swarm agents will have access to a couple of context variables in relation to the itinerary." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [], + "source": [ + "trip_context = {\n", + " \"itinerary_confirmed\": False,\n", + " \"itinerary\": \"\",\n", + " \"structured_itinerary\": None,\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Agent Functions\n", + "\n", + "We have two functions/tools for our agents.\n", + "\n", + "One for our Planner agent to mark an itinerary as confirmed by the customer and to store the final text itinerary. This will then transfer to our Structured Output agent.\n", + "\n", + "Another for the Structured Output Agent to save the structured itinerary and transfer to the Route Timing agent." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [], + "source": [ + "def mark_itinerary_as_complete(final_itinerary: str, context_variables: Dict[str, Any]) -> SwarmResult:\n", + " \"\"\"Store and mark our itinerary as accepted by the customer.\"\"\"\n", + " context_variables[\"itinerary_confirmed\"] = True\n", + " context_variables[\"itinerary\"] = final_itinerary\n", + "\n", + " # This will update the context variables and then transfer to the Structured Output agent\n", + " return SwarmResult(\n", + " agent=\"structured_output_agent\", context_variables=context_variables, values=\"Itinerary recorded and confirmed.\"\n", + " )\n", + "\n", + "\n", + "def create_structured_itinerary(context_variables: Dict[str, Any], structured_itinerary: str) -> SwarmResult:\n", + " \"\"\"Once a structured itinerary is created, store it and pass on to the Route Timing agent.\"\"\"\n", + "\n", + " # Ensure the itinerary is confirmed, if not, back to the Planner agent to confirm it with the customer\n", + " if not context_variables[\"itinerary_confirmed\"]:\n", + " return SwarmResult(\n", + " agent=\"planner_agent\",\n", + " values=\"Itinerary not confirmed, please confirm the itinerary with the customer first.\",\n", + " )\n", + "\n", 
+ "    context_variables[\"structured_itinerary\"] = structured_itinerary\n", + "\n", + "    # This will update the context variables and then transfer to the Route Timing agent\n", + "    return SwarmResult(\n", + "        agent=\"route_timing_agent\", context_variables=context_variables, values=\"Structured itinerary stored.\"\n", + "    )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Agents\n", + "\n", + "Our SwarmAgents and a UserProxyAgent (human), which the swarm will interact with." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "# Planner agent, interacting with the customer and GraphRag agent, to create an itinerary\n", + "planner_agent = SwarmAgent(\n", + "    name=\"planner_agent\",\n", + "    system_message=\"You are a trip planner agent. It is important to know where the customer is going, how many days they are staying, and what they want to do. \"\n", + "    + \"You will work with another agent, graphrag_agent, to get information about restaurants and attractions. \"\n", + "    + \"You are also working with the customer, so you must ask the customer what they want to do if you don’t have LOCATION, NUMBER OF DAYS, MEALS, and ATTRACTIONS. \"\n", + "    + \"When you have the customer's requirements, work with graphrag_agent to get information for an itinerary. \"\n", + "    + \"You are responsible for creating the itinerary and for each day in the itinerary you MUST HAVE events and EACH EVENT MUST HAVE a 'type' ('Restaurant' or 'Attraction'), 'location' (name of restaurant or attraction), 'city', and 'description'. 
\"\n", + "    + \"Finally, YOU MUST ask the customer if they are happy with the itinerary before marking the itinerary as complete.\",\n", + "    functions=[mark_itinerary_as_complete],\n", + "    llm_config=llm_config,\n", + ")\n", + "\n", + "# FalkorDB GraphRAG agent, utilising the FalkorDB to gather data for the Planner agent\n", + "graphrag_agent = SwarmAgent(\n", + "    name=\"graphrag_agent\",\n", + "    system_message=\"Return a list of restaurants and/or attractions. List them separately and provide ALL the options in the location. Do not provide travel advice.\",\n", + ")\n", + "\n", + "# Adding the FalkorDB capability to the agent\n", + "graph_rag_capability = FalkorGraphRagCapability(query_engine)\n", + "graph_rag_capability.add_to_agent(graphrag_agent)\n", + "\n", + "# Structured Output agent, formatting the itinerary into a structured format through the response_format on the LLM Configuration\n", + "structured_config_list = copy.deepcopy(config_list)\n", + "for config in structured_config_list:\n", + "    config[\"response_format\"] = Itinerary\n", + "\n", + "structured_output_agent = SwarmAgent(\n", + "    name=\"structured_output_agent\",\n", + "    system_message=\"You are a data formatting agent; format the provided itinerary in the context below into the provided format.\",\n", + "    llm_config={\"config_list\": structured_config_list, \"timeout\": 120},\n", + "    functions=[create_structured_itinerary],\n", + ")\n", + "\n", + "# Route Timing agent, adding estimated travel times to the itinerary by utilising the Google Maps Platform\n", + "route_timing_agent = SwarmAgent(\n", + "    name=\"route_timing_agent\",\n", + "    system_message=\"You are a route timing agent. YOU MUST call the update_itinerary_with_travel_times tool if the exact phrase 'Timed itinerary added to context with travel times' has not appeared in this conversation. 
Only then tell the customer 'Your itinerary is ready!'.\",\n", + "    llm_config=llm_config,\n", + "    functions=[update_itinerary_with_travel_times],\n", + ")\n", + "\n", + "# Our customer will be a human in the loop\n", + "customer = UserProxyAgent(name=\"customer\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Hand offs and After works\n", + "\n", + "In conjunction with the agents' associated functions, we establish rules that govern the swarm orchestration through hand offs and After works.\n", + "\n", + "For more details on the swarm orchestration, [see the documentation](https://ag2ai.github.io/ag2/docs/topics/swarm)." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [], + "source": [ + "planner_agent.register_hand_off(\n", + "    hand_to=[\n", + "        ON_CONDITION(\n", + "            graphrag_agent,\n", + "            \"Need information on the restaurants and attractions for a location. DO NOT call more than once at a time.\",\n", + "        ),  # Get info from FalkorDB GraphRAG\n", + "        ON_CONDITION(structured_output_agent, \"Itinerary is confirmed by the customer\"),\n", + "        AFTER_WORK(AfterWorkOption.REVERT_TO_USER),  # Revert to the customer for more information on their plans\n", + "    ]\n", + ")\n", + "\n", + "\n", + "# Back to the Planner when information has been retrieved\n", + "graphrag_agent.register_hand_off(hand_to=[AFTER_WORK(planner_agent)])\n", + "\n", + "# Once we have formatted our itinerary, we can hand off to the route timing agent to add in the travel timings\n", + "structured_output_agent.register_hand_off(hand_to=[AFTER_WORK(route_timing_agent)])\n", + "\n", + "# Finally, once the route timing agent has finished, we can terminate the swarm\n", + "route_timing_agent.register_hand_off(\n", + "    hand_to=[AFTER_WORK(AfterWorkOption.TERMINATE)]  # Once this agent has finished, the swarm can terminate\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Run the 
swarm\n", + "\n", + "Let's get an itinerary for a couple of days in Rome." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[33mcustomer\u001b[0m (to chat_manager):\n", + "\n", + "I want to go to Rome for a couple of days. Can you help me plan my trip?\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: planner_agent\n", + "\u001b[0m\n", + "\u001b[33mplanner_agent\u001b[0m (to chat_manager):\n", + "\n", + "Certainly! I can help you plan your trip to Rome. For a comprehensive itinerary, could you please tell me how many days you will be staying in Rome and what activities or attractions you're interested in? Additionally, would you like recommendations for meals or specific types of cuisine?\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: customer\n", + "\u001b[0m\n", + "INFO: 172.17.0.1:59006 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:59018 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:59030 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:59046 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:59018 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:59006 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "\u001b[33mcustomer\u001b[0m (to chat_manager):\n", + "\n", + "i will be staying for 4 days and want to see the coliseum and the vatican. 
I like fettucini arrabiatta con salsicca.\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: planner_agent\n", + "\u001b[0m\n", + "\u001b[33mplanner_agent\u001b[0m (to chat_manager):\n", + "\n", + "\u001b[32m***** Suggested tool call (call_mSZozTJG2pfjomWjQE9JyHJ2): transfer_to_graphrag_agent *****\u001b[0m\n", + "Arguments: \n", + "{}\n", + "\u001b[32m*******************************************************************************************\u001b[0m\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: Tool_Execution\n", + "\u001b[0m\n", + "\u001b[35m\n", + ">>>>>>>> EXECUTING FUNCTION transfer_to_graphrag_agent...\u001b[0m\n", + "\u001b[33mTool_Execution\u001b[0m (to chat_manager):\n", + "\n", + "\u001b[32m***** Response from calling tool (call_mSZozTJG2pfjomWjQE9JyHJ2) *****\u001b[0m\n", + "SwarmAgent --> graphrag_agent\n", + "\u001b[32m**********************************************************************\u001b[0m\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: graphrag_agent\n", + "\u001b[0m\n", + "\u001b[33mgraphrag_agent\u001b[0m (to chat_manager):\n", + "\n", + "**Attractions in Rome:**\n", + "1. Colosseum - An ancient amphitheater known for gladiatorial contests and public spectacles.\n", + "2. Vatican Museums - A complex of museums and galleries showcasing works of art collected by Popes over centuries.\n", + "3. Trevi Fountain - A Baroque fountain known for its stunning sculptures and tradition of tossing coins.\n", + "4. Basilica di Santa Maria Maggiore - A major basilica in Rome, known for its rich history and impressive architecture.\n", + " \n", + "**Restaurants in Rome:**\n", + "1. Trattoria da Enzo - A cozy trattoria known for its traditional Roman dishes and welcoming atmosphere. 
Rating: 4.5\n", + "2. Il Pagliaccio - An elegant Michelin-starred restaurant offering contemporary Italian cuisine. Rating: 4.8\n", + "3. Tonnarello - Casual stop for pasta, meatballs & other simple Roman dishes, plus patio seating & acoustic guitar. Rating: 4.7\n", + "4. Osteria delle Commari - Local home style Roman restaurant near Vatican Museums. Rating: 4.3\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: planner_agent\n", + "\u001b[0m\n", + "\u001b[33mplanner_agent\u001b[0m (to chat_manager):\n", + "\n", + "Great! I have gathered some information on attractions and restaurants in Rome. Let's create an itinerary for your 4-day stay. \n", + "\n", + "### Day 1:\n", + "- **Attraction**: Colosseum\n", + " - **City**: Rome\n", + " - **Description**: An ancient amphitheater known for gladiatorial contests and public spectacles.\n", + "\n", + "- **Restaurant**: Trattoria da Enzo\n", + " - **City**: Rome\n", + " - **Description**: A cozy trattoria known for its traditional Roman dishes and welcoming atmosphere. Perfect spot to try your favorite Fettucini Arrabiatta con Salsicca.\n", + "\n", + "### Day 2:\n", + "- **Attraction**: Vatican Museums\n", + " - **City**: Rome\n", + " - **Description**: A complex of museums and galleries showcasing works of art collected by Popes over centuries.\n", + "\n", + "- **Restaurant**: Osteria delle Commari\n", + " - **City**: Rome\n", + " - **Description**: Local home-style Roman restaurant near Vatican Museums. 
Enjoy some local flavors after your museum visit.\n", + "\n", + "### Day 3:\n", + "- **Attraction**: Trevi Fountain\n", + " - **City**: Rome\n", + " - **Description**: A Baroque fountain known for its stunning sculptures and tradition of tossing coins.\n", + "\n", + "- **Restaurant**: Tonnarello\n", + " - **City**: Rome\n", + " - **Description**: Casual stop for pasta, meatballs & other simple Roman dishes, plus patio seating & acoustic guitar. Enjoy a lovely dinner in a lively setting.\n", + "\n", + "### Day 4:\n", + "- **Attraction**: Basilica di Santa Maria Maggiore\n", + " - **City**: Rome\n", + " - **Description**: A major basilica in Rome, known for its rich history and impressive architecture.\n", + "\n", + "- **Restaurant**: Il Pagliaccio\n", + " - **City**: Rome\n", + " - **Description**: An elegant Michelin-starred restaurant offering contemporary Italian cuisine. Perfect to conclude your trip with a refined dining experience.\n", + "\n", + "Please let me know if you are satisfied with this itinerary or if you would like any changes!\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: customer\n", + "\u001b[0m\n", + "INFO: 172.17.0.1:48618 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48630 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48634 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48640 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48630 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48618 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "\u001b[33mcustomer\u001b[0m (to chat_manager):\n", + "\n", + "I'm satisfied with the itinerary.\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: planner_agent\n", + "\u001b[0m\n", + "\u001b[33mplanner_agent\u001b[0m (to chat_manager):\n", + "\n", + "\u001b[32m***** Suggested tool call 
(call_3Yfto3nOhSTEBjIfcuSELX3x): mark_itinerary_as_complete *****\u001b[0m\n", + "Arguments: \n", + "{\"final_itinerary\": \"### Day 1:\\n- **Attraction**: Colosseum\\n - **City**: Rome\\n - **Description**: An ancient amphitheater known for gladiatorial contests and public spectacles.\\n\\n- **Restaurant**: Trattoria da Enzo\\n - **City**: Rome\\n - **Description**: A cozy trattoria known for its traditional Roman dishes and welcoming atmosphere. Perfect spot to try your favorite Fettucini Arrabiatta con Salsicca.\\n\\n### Day 2:\\n- **Attraction**: Vatican Museums\\n - **City**: Rome\\n - **Description**: A complex of museums and galleries showcasing works of art collected by Popes over centuries.\\n\\n- **Restaurant**: Osteria delle Commari\\n - **City**: Rome\\n - **Description**: Local home-style Roman restaurant near Vatican Museums. Enjoy some local flavors after your museum visit.\\n\\n### Day 3:\\n- **Attraction**: Trevi Fountain\\n - **City**: Rome\\n - **Description**: A Baroque fountain known for its stunning sculptures and tradition of tossing coins.\\n\\n- **Restaurant**: Tonnarello\\n - **City**: Rome\\n - **Description**: Casual stop for pasta, meatballs & other simple Roman dishes, plus patio seating & acoustic guitar. Enjoy a lovely dinner in a lively setting.\\n\\n### Day 4:\\n- **Attraction**: Basilica di Santa Maria Maggiore\\n - **City**: Rome\\n - **Description**: A major basilica in Rome, known for its rich history and impressive architecture.\\n\\n- **Restaurant**: Il Pagliaccio\\n - **City**: Rome\\n - **Description**: An elegant Michelin-starred restaurant offering contemporary Italian cuisine. 
Perfect to conclude your trip with a refined dining experience.\"}\n", + "\u001b[32m*******************************************************************************************\u001b[0m\n", + "\u001b[32m***** Suggested tool call (call_Mywl90Yz282vBonoVy1N5NuO): transfer_to_structured_output_agent *****\u001b[0m\n", + "Arguments: \n", + "{}\n", + "\u001b[32m****************************************************************************************************\u001b[0m\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: Tool_Execution\n", + "\u001b[0m\n", + "\u001b[35m\n", + ">>>>>>>> EXECUTING FUNCTION mark_itinerary_as_complete...\u001b[0m\n", + "\u001b[35m\n", + ">>>>>>>> EXECUTING FUNCTION transfer_to_structured_output_agent...\u001b[0m\n", + "\u001b[33mTool_Execution\u001b[0m (to chat_manager):\n", + "\n", + "\u001b[32m***** Response from calling tool (call_3Yfto3nOhSTEBjIfcuSELX3x) *****\u001b[0m\n", + "Itinerary recorded and confirmed.\n", + "\u001b[32m**********************************************************************\u001b[0m\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m***** Response from calling tool (call_Mywl90Yz282vBonoVy1N5NuO) *****\u001b[0m\n", + "SwarmAgent --> structured_output_agent\n", + "\u001b[32m**********************************************************************\u001b[0m\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: structured_output_agent\n", + "\u001b[0m\n", + "\u001b[33mstructured_output_agent\u001b[0m (to chat_manager):\n", + "\n", + "{\"days\":[{\"events\":[{\"type\":\"Attraction\",\"location\":\"Colosseum\",\"city\":\"Rome\",\"description\":\"An ancient amphitheater known for gladiatorial contests and public spectacles.\"},{\"type\":\"Restaurant\",\"location\":\"Trattoria da 
Enzo\",\"city\":\"Rome\",\"description\":\"A cozy trattoria known for its traditional Roman dishes and welcoming atmosphere. Perfect spot to try your favorite Fettucini Arrabiatta con Salsicca.\"}]},{\"events\":[{\"type\":\"Attraction\",\"location\":\"Vatican Museums\",\"city\":\"Rome\",\"description\":\"A complex of museums and galleries showcasing works of art collected by Popes over centuries.\"},{\"type\":\"Restaurant\",\"location\":\"Osteria delle Commari\",\"city\":\"Rome\",\"description\":\"Local home-style Roman restaurant near Vatican Museums. Enjoy some local flavors after your museum visit.\"}]},{\"events\":[{\"type\":\"Attraction\",\"location\":\"Trevi Fountain\",\"city\":\"Rome\",\"description\":\"A Baroque fountain known for its stunning sculptures and tradition of tossing coins.\"},{\"type\":\"Restaurant\",\"location\":\"Tonnarello\",\"city\":\"Rome\",\"description\":\"Casual stop for pasta, meatballs & other simple Roman dishes, plus patio seating & acoustic guitar. Enjoy a lovely dinner in a lively setting.\"}]},{\"events\":[{\"type\":\"Attraction\",\"location\":\"Basilica di Santa Maria Maggiore\",\"city\":\"Rome\",\"description\":\"A major basilica in Rome, known for its rich history and impressive architecture.\"},{\"type\":\"Restaurant\",\"location\":\"Il Pagliaccio\",\"city\":\"Rome\",\"description\":\"An elegant Michelin-starred restaurant offering contemporary Italian cuisine. 
Perfect to conclude your trip with a refined dining experience.\"}]}]}\n", + "\n", + "--------------------------------------------------------------------------------\n", + "\u001b[32m\n", + "Next speaker: route_timing_agent\n", + "\u001b[0m\n", + "\u001b[33mroute_timing_agent\u001b[0m (to chat_manager):\n", + "\n", + "None\n", + "\n", + "--------------------------------------------------------------------------------\n" + ] + } + ], + "source": [ + "# Start the conversation\n", + "\n", + "chat_result, context_variables, last_agent = initiate_swarm_chat(\n", + " initial_agent=planner_agent,\n", + " agents=[planner_agent, graphrag_agent, structured_output_agent, route_timing_agent],\n", + " user_agent=customer,\n", + " context_variables=trip_context,\n", + " messages=\"I want to go to Rome for a couple of days. Can you help me plan my trip?\",\n", + " after_work=AfterWorkOption.TERMINATE,\n", + " max_rounds=100,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Bonus itinerary output" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "No itinerary available to print.\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO: 172.17.0.1:40530 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40536 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40540 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40542 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40536 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40530 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40530 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40530 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40530 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40530 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:40530 - 
\"POST /graphql HTTP/1.1\" 200 OK\n",
+ "INFO: 172.17.0.1:43838 - \"POST /graphql HTTP/1.1\" 200 OK\n",
+ "INFO: 172.17.0.1:35308 - \"POST /graphql HTTP/1.1\" 200 OK\n",
+ "INFO: 172.17.0.1:48872 - \"GET /projects/UHJvamVjdDoy HTTP/1.1\" 200 OK\n",
+ "INFO: 172.17.0.1:48880 - \"POST /graphql HTTP/1.1\" 
200 OK\n", + "INFO: 172.17.0.1:48880 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48880 - \"POST /graphql HTTP/1.1\" 200 OK\n", + "INFO: 172.17.0.1:48880 - \"POST /graphql HTTP/1.1\" 200 OK\n" + ] + } + ], + "source": [ + "def print_itinerary(itinerary_data):\n", + " header = \"β–ˆ β–ˆ\\n β–ˆ β–ˆ \\n β–ˆ β–ˆβ–ˆβ–ˆβ–ˆβ–ˆ β–ˆ \\n β–ˆβ–ˆ β–ˆβ–ˆ \\n β–ˆ β–ˆ \\n β–ˆ β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ β–ˆ \\n β–ˆ β–ˆβ–ˆ β–ˆβ–ˆβ–ˆ β–ˆβ–ˆ β–ˆ \\n β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ \\n\\n β–ˆβ–ˆ β–ˆβ–ˆβ–ˆ β–ˆβ–ˆβ–ˆ \\nβ–ˆ β–ˆ β–ˆ β–ˆ \\nβ–ˆβ–ˆβ–ˆβ–ˆ β–ˆ β–ˆβ–ˆ β–ˆβ–ˆ \\nβ–ˆ β–ˆ β–ˆ β–ˆ β–ˆ \\nβ–ˆ β–ˆ β–ˆβ–ˆ β–ˆβ–ˆβ–ˆβ–ˆ \\n\"\n", + " width = 80\n", + " icons = {\"Travel\": \"🚢\", \"Restaurant\": \"🍽️\", \"Attraction\": \"πŸ›οΈ\"}\n", + "\n", + " for line in header.split(\"\\n\"):\n", + " print(line.center(width))\n", + " print(f\"Itinerary for {itinerary_data['days'][0]['events'][0]['city']}\".center(width))\n", + " print(\"=\" * width)\n", + "\n", + " for day_num, day in enumerate(itinerary_data[\"days\"], 1):\n", + " print(f\"\\nDay {day_num}\".center(width))\n", + " print(\"-\" * width)\n", + "\n", + " for event in day[\"events\"]:\n", + " event_type = event[\"type\"]\n", + " print(f\"\\n {icons[event_type]} {event['location']}\")\n", + " if event_type != \"Travel\":\n", + " words = event[\"description\"].split()\n", + " line = \" \"\n", + " for word in words:\n", + " if len(line) + len(word) + 1 <= 76:\n", + " line += word + \" \"\n", + " else:\n", + " print(line)\n", + " line = \" \" + word + \" \"\n", + " if line.strip():\n", + " print(line)\n", + " else:\n", + " print(f\" {event['description']}\")\n", + " print(\"\\n\" + \"-\" * width)\n", + "\n", + "\n", + "if \"timed_itinerary\" in context_variables:\n", + " print_itinerary(context_variables[\"timed_itinerary\"])\n", + "else:\n", + " print(\"No itinerary available to print.\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + 
"front_matter": {
+ "description": "FalkorDB GraphRAG utilises a knowledge graph and can be added as a capability to agents. Combined with a swarm orchestration of agents, it is highly effective at providing a RAG capability.",
+ "tags": [
+ "RAG",
+ "tool/function",
+ "swarm"
+ ]
+ },
+ "kernelspec": {
+ "display_name": ".venv",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.11.8"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}