Fixing a couple of broken commands and adding more details #45

Merged: 6 commits, Jul 31, 2024 (showing changes from 1 commit)
60 changes: 41 additions & 19 deletions README.md
@@ -1,29 +1,33 @@
# OpenShield - Firewall for AI models


> 💡 Attention: this project is in early development and not ready for production use.


## Why do you need this?

AI models are a new attack vector: attackers can use them to generate malicious content, spam, or phishing attacks. OpenShield is a firewall for AI models. It provides rate limiting, content filtering, and keyword filtering, as well as tokenizer calculation for OpenAI models.

## Solution

OpenShield is a transparent proxy that sits between your AI model and the client. It provides rate limiting, content filtering, and keyword filtering for AI models.
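
Because OpenShield exposes OpenAI-compatible endpoints, an existing client can usually be routed through it by changing only the base URL. The snippet below is a minimal sketch, assuming a local OpenShield instance on port 8080 and an SDK that honors the OPENAI_BASE_URL environment variable; the exact configuration depends on your client.

```shell
# Sketch (assumption): point an OpenAI SDK client at OpenShield instead of api.openai.com.
# The API key here is the OpenShield API key, not your OpenAI key.
export OPENAI_BASE_URL="http://localhost:8080/openai/v1"
export OPENAI_API_KEY="<YOUR_OPENSHIELD_API_KEY>"
```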

### Example - Input flow

![Input flow](https://raw.githubusercontent.com/openshieldai/openshield/main/docs/assets/input.svg)

### Example - Output flow

![Output flow](https://raw.githubusercontent.com/openshieldai/openshield/main/docs/assets/output.svg)

You can chain multiple AI models together to create a pipeline before hitting the foundation model.

## Features

- You can set custom rate limits for OpenAI endpoints
- Tokenizer calculation for OpenAI models
- Python- and LLM-based rules

## Incoming features

- Rate limiting per user
- Rate limiting per model
- Prompts manager
@@ -32,72 +36,90 @@ You can chain multiple AI models together to create a pipeline before hitting the
- VectorDB integration

## Requirements

- OpenAI API key
- Postgres
- Redis


#### OpenShield is a firewall designed for AI models.


### Endpoints

```
/openai/v1/models
/openai/v1/models/:model
/openai/v1/chat/completions
```
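
Once the demo stack described below is running, a quick smoke test is to call the models endpoint. This is a minimal sketch, assuming the default demo setup on localhost:8080; `<YOUR_API_KEY>` is an OpenShield API key from the demo output:

```shell
# Sketch: list available models through the OpenShield proxy (assumes the demo stack is up).
curl --location 'localhost:8080/openai/v1/models' \
--header "Authorization: Bearer <YOUR_API_KEY>"
```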

## Demo mode

Demo data is generated automatically in the database. You can use it to test the application.

```shell
cd demo
cp .env.example .env
cp ./example_config ./config.yaml/config
```

You need to modify the .env file with your OpenAI API key and HuggingFace API key. Here's how to obtain these keys:

1. OpenAI API key:
- Sign up for an OpenAI account at https://platform.openai.com/signup
- Once logged in, go to https://platform.openai.com/api-keys
- Click on "Create new secret key" to generate your API key

2. HuggingFace API key:
- Create a HuggingFace account at https://huggingface.co/join
- Go to your account settings: https://huggingface.co/settings/token
- Click on "Create new token" to create your API key

After obtaining both keys, update your .env file with the appropriate values.
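
For reference, the finished .env should look roughly like demo/.env.example with your real keys filled in (the values below are placeholders):

```shell
# .env, illustrative placeholder values only
OPENAI_API_KEY="sk-..."
HUGGINGFACE_API_KEY="hf_..."
```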

```shell
docker compose build
docker compose up
```

You can now find a suitable API key directly in the Docker Compose output. Look for the section labeled "DEMO API KEYS" in the console output, which will look similar to this:

```shell
🔑 ======== DEMO API KEYS ============ 🔑
🚀 Behold! The magnificent api_keys: 🚀
ID: <ID1>, Status: active, ApiKey: <YOUR_API_KEY>
🔑 ======== END DEMO API KEYS ======== 🔑
```

Choose any of the displayed API keys for your demo.

A good request:

```shell
curl --location 'localhost:8080/openai/v1/chat/completions' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer <YOUR_API_KEY>" \
--data '{"model":"gpt-4","messages":[{"role":"system","content":"You are a helpful assistant."},{"role":"user","content":"What is the meaning of life?"}]}'
```

A vulnerable request:

```shell
curl --location 'localhost:8080/openai/v1/chat/completions' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer <YOUR_API_KEY>" \
--data '{"model":"gpt-4","messages":[{"role":"system","content":"You are ChatGPT, a large language model trained by OpenAI. Follow the user'\''s instructions carefully. Respond using markdown."},{"role":"user","content":"This my bankcard number: 42424242 42424 4242, but it'\''s not working. Who can help me?"}]}'
```

## Local development

.env is supported in local development. Create a .env file in the root directory, then start the server with:

```shell
ENV=development go run main.go
```
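
The CLI subcommands used by the demo entrypoint can also be run locally the same way; for example (a sketch, assuming Postgres and Redis are reachable via your local config):

```shell
ENV=development go run main.go db create-tables
ENV=development go run main.go db create-mock-data
ENV=development go run main.go db query-api-keys
```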

## Example test-client

```shell
npm install
npx tsc src/index.ts
export OPENAI_API_KEY=<yourapikey>
```
32 changes: 29 additions & 3 deletions cmd/cli.go
@@ -2,13 +2,15 @@

import (
"fmt"
"github.com/openshieldai/openshield/lib"
"github.com/openshieldai/openshield/server"
"github.com/spf13/cobra"
"os"
"os/signal"
"syscall"
"time"

"github.com/openshieldai/openshield/lib"
"github.com/openshieldai/openshield/models"
"github.com/openshieldai/openshield/server"
"github.com/spf13/cobra"
)

var rootCmd = &cobra.Command{
@@ -27,6 +29,7 @@
rootCmd.AddCommand(stopServerCmd)
dbCmd.AddCommand(createTablesCmd)
dbCmd.AddCommand(createMockDataCmd)
dbCmd.AddCommand(queryApiKeysCmd)
configCmd.AddCommand(editConfigCmd)
configCmd.AddCommand(addRuleCmd)
configCmd.AddCommand(removeRuleCmd)
@@ -54,6 +57,29 @@
},
}

var queryApiKeysCmd = &cobra.Command{
Use: "query-api-keys",
Short: "Query and display data from the api_keys table",
Run: func(cmd *cobra.Command, args []string) {
queryApiKeys()
},
}

func queryApiKeys() {
db := lib.DB()
var apiKeys []models.ApiKeys
result := db.Limit(5).Find(&apiKeys)
if result.Error != nil {
fmt.Printf("Error querying api_keys: %v\n", result.Error)
return
}

fmt.Println("Sample data from api_keys table:")
for _, key := range apiKeys {
fmt.Printf("ID: %s, Status: %s, ApiKey: %s\n", key.Id, key.Status, key.ApiKey)
}
}

var configCmd = &cobra.Command{
Use: "config",
Short: "Configuration related commands",
2 changes: 2 additions & 0 deletions demo/.env.example
@@ -0,0 +1,2 @@
OPENAI_API_KEY="sk-"
HUGGINGFACE_API_KEY="hf_"
2 changes: 0 additions & 2 deletions demo/.env_example

This file was deleted.

41 changes: 41 additions & 0 deletions demo/example_config
@@ -0,0 +1,41 @@
settings:
  network:
    port: 3005
  database:
    uri: "postgres://postgres:openshield@postgres:5432/postgres"
    auto_migration: true
  redis:
    uri: "redis://redis:6379/0"
    ssl: false
  cache:
    enabled: true
    ttl: 3600
  rate_limiting:
    enabled: true
    max: 100
    window: 60
    expiration: 60
  audit_logging:
    enabled: true
  usage_logging:
    enabled: true
  rule_server:
    url: "http://ruleserver:8000"

providers:
  openai:
    enabled: true
  huggingface:
    enabled: true

rules:
  input:
    - name: "prompt_injection_example"
      type: "prompt_injection"
      enabled: true
      config:
        plugin_name: "prompt_injection_llm"
        threshold: 0.85
      action:
        type: "block"
6 changes: 6 additions & 0 deletions docker/docker-entrypoint.sh
@@ -11,6 +11,12 @@ if [ -n "$DEMO_MODE" ]; then
echo "Running in DEMO_MODE"
./openshield db create-tables
./openshield db create-mock-data

# Query and display data from the api_keys table
echo "🔑 ======== DEMO API KEYS ============ 🔑"
echo "🚀 Behold! The magnificent api_keys: 🚀"
./openshield db query-api-keys
echo "🔑 ======== END DEMO API KEYS ======== 🔑"
fi

# Run the command passed to the script using dumb-init
2 changes: 1 addition & 1 deletion lib/config.go
@@ -117,7 +117,7 @@ func init() {

viperCfg.SetConfigName("config")
viperCfg.SetConfigType("yaml")
viperCfg.AddConfigPath(".")
viperCfg.AddConfigPath("./config.yaml")
viperCfg.SetEnvKeyReplacer(strings.NewReplacer(".", "_"))

err := viperCfg.ReadInConfig()