diff --git a/docs/llms.mdx b/docs/llms.mdx
index e3b4720db..98897dda4 100644
--- a/docs/llms.mdx
+++ b/docs/llms.mdx
@@ -7,11 +7,11 @@
 The generated code is then executed to produce the result.
 
 [![Choose the LLM](https://cdn.loom.com/sessions/thumbnails/5496c9c07ee04f69bfef1bc2359cd591-00001.jpg)](https://www.loom.com/share/5496c9c07ee04f69bfef1bc2359cd591 "Choose the LLM")
-You can either choose a LLM by instantiating one and passing it to the `SmartDataFrame` or `SmartDatalake` constructor,
-or you can specify one in the `pandasai.json` file.
+You can choose an LLM either by instantiating it and passing it to the `SmartDataframe` or `SmartDatalake` constructor,
+or by specifying it in the `pandasai.json` configuration file.
 
 If the model expects one or more parameters, you can pass them to the constructor or specify them in the `pandasai.json`
-file, in the `llm_options` param, as it follows:
+file, in the `llm_options` parameter. Here's an example of how to structure your `pandasai.json` file:
 
 ```json
 {
@@ -21,6 +21,24 @@ file, in the `llm_options` param, as it follows:
 }
 }
 ```
 
+> **Note:**
+> `pandasai.json` can be configured for any LLM.
+
+## Working with the `pandasai.json` file
+
+In this example, `data.csv` is your data file and `pandasai.json` is the configuration file. Make sure the configuration file is named `pandasai.json` and sits in the same folder as your code.
+
+```python
+from pandasai import SmartDataframe
+from pandasai.config import load_config_from_json
+
+# Load configuration from pandasai.json
+config = load_config_from_json()
+
+df = SmartDataframe("data.csv", config=config)
+response = df.chat("give me revenue of Top 5 companies for year 2021")
+print(response)
+```
 
 ## BambooLLM
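
The `pandasai.json` example in the first hunk is cut off by the hunk boundary, so the `llm_options` shape it describes is not visible in this patch. As a rough sketch only — the `llm` name and `api_key` field below are illustrative assumptions, not taken from the patch — a minimal configuration using `llm_options` might look like:

```json
{
  "llm": "BambooLLM",
  "llm_options": {
    "api_key": "YOUR_API_KEY_HERE"
  }
}
```

These keys are placeholders; the rest of docs/llms.mdx lists the options each supported LLM actually expects.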