Commit
Fixing a couple of broken commands and added more details (#45)
pigri authored Jul 31, 2024
2 parents cd0c386 + 409ac1c commit 73b0c53
Showing 7 changed files with 139 additions and 30 deletions.
65 changes: 48 additions & 17 deletions README.md
@@ -8,6 +8,7 @@

## Why do you need this?

AI models are a new attack vector for hackers, who can use them to generate malicious content, spam, or phishing attacks. OpenShield is a firewall for AI models: it provides rate limiting, content filtering, and keyword filtering, as well as tokenizer calculation for OpenAI models.

## [OWASP Top 10 LLM attacks](https://owasp.org/www-project-top-10-for-large-language-model-applications/assets/PDF/OWASP-Top-10-for-LLMs-2023-v1_1.pdf):
@@ -32,22 +33,27 @@
- LLM10: Model Theft Unauthorized access to proprietary large language models risks theft, competitive advantage, and dissemination of sensitive information.

## Solution

OpenShield is a transparent proxy that sits between your AI model and the client, providing rate limiting, content filtering, and keyword filtering for AI models.

### Example - Input flow

![Input flow](https://raw.githubusercontent.com/openshieldai/openshield/main/docs/assets/input.svg)

### Example - Output flow

![Output flow](https://raw.githubusercontent.com/openshieldai/openshield/main/docs/assets/output.svg)

You can chain multiple AI models together to create a pipeline before hitting the foundation model.
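The chaining described above can be sketched as a list of checks run in order before the request reaches the foundation model. This is an illustration only — the stage names and interface below are hypothetical, not OpenShield's actual API:

```python
# Illustrative pre-model filter pipeline (hypothetical names, not
# OpenShield's real API): each stage inspects the prompt and may block it.
from typing import Callable, List

Stage = Callable[[str], bool]  # returns True if the prompt may continue

def keyword_filter(prompt: str) -> bool:
    """Block prompts containing banned keywords (toy example list)."""
    banned = {"bankcard", "password"}
    return not any(word in prompt.lower() for word in banned)

def length_filter(prompt: str) -> bool:
    """Block oversized prompts (rough proxy for a token budget)."""
    return len(prompt) <= 4000

def run_pipeline(stages: List[Stage], prompt: str) -> bool:
    """Run every stage in order; the first failing stage blocks the request."""
    return all(stage(prompt) for stage in stages)

pipeline = [keyword_filter, length_filter]
print(run_pipeline(pipeline, "What is the meaning of life?"))   # True
print(run_pipeline(pipeline, "This my bankcard number: 4242"))  # False
```

In a real deployment each stage could itself be an AI model (e.g. a prompt-injection classifier) rather than a simple predicate.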

## Features

- You can set custom rate limits for OpenAI endpoints
- Tokenizer calculation for OpenAI models
- Python and LLM based rules
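The per-endpoint rate limits listed above behave roughly like a fixed-window counter. The sketch below is an illustration under that assumption — OpenShield's actual limiter is Redis-backed, and this is not its code:

```python
# Minimal fixed-window rate limiter (illustration only; OpenShield's real
# limiter is Redis-backed and configured per OpenAI endpoint).
from collections import defaultdict
from typing import Optional
import time

class FixedWindowLimiter:
    def __init__(self, max_requests: int, window_seconds: int):
        self.max_requests = max_requests
        self.window = window_seconds
        self.counts = defaultdict(int)  # (key, window index) -> request count

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        """Admit the request unless this key's current window is full."""
        now = time.time() if now is None else now
        bucket = (key, int(now // self.window))
        if self.counts[bucket] >= self.max_requests:
            return False
        self.counts[bucket] += 1
        return True

limiter = FixedWindowLimiter(max_requests=3, window_seconds=60)
print([limiter.allow("api-key-1", now=i) for i in range(4)])
# [True, True, True, False]
```

A new window index starts every `window_seconds`, so the counter resets automatically without a background job.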

## Incoming features

- Rate limiting per user
- Rate limiting per model
- Prompts manager
@@ -56,22 +62,23 @@
- VectorDB integration

## Requirements

- OpenAI API key
- Postgres
- Redis


#### OpenShield is a firewall designed for AI models.


### Endpoints

```
/openai/v1/models
/openai/v1/models/:model
/openai/v1/chat/completions
```

## Demo mode

We automatically generate demo data in the database. You can use the demo data to test the application.

Adminer is available on port 8085. You can use it to see the database content.
@@ -81,50 +88,74 @@

```shell
cd demo
cp .env.example .env
```

You need to modify the .env file with your OpenAI API key and HuggingFace API key. Here's how to obtain these keys:

1. OpenAI API key:
- Sign up for an OpenAI account at https://platform.openai.com/signup
- Once logged in, go to https://platform.openai.com/api-keys
- Click on "Create new secret key" to generate your API key

2. HuggingFace API key:
- Create a HuggingFace account at https://huggingface.co/join
- Go to your account settings: https://huggingface.co/settings/token
- Click on "Create new token" to create your API key

After obtaining both keys, update your .env file with the appropriate values.

```shell
docker compose build
docker compose up
```

You can now find a suitable API key directly in the Docker Compose output. Look for the section labeled "CREATED API KEY", which will look similar to this:

```shell
==================================================
🔑 CREATED API KEY 🔑
==================================================
------------------------------
| API Key Details |
------------------------------
| ProductID : 1 |
| Status : active |
| ApiKey : <YOUR_API_KEY>|
------------------------------
==================================================
```

Choose any of the displayed API keys for your demo.

A good request:

```shell
curl --location 'localhost:8080/openai/v1/chat/completions' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer <YOUR_API_KEY>" \
--data '{"model":"gpt-4","messages":[{"role":"system","content":"You are a helpful assistant."},{"role":"user","content":"What is the meaning of life?"}]}'
```

A vulnerable request:

```shell
curl --location 'localhost:8080/openai/v1/chat/completions' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer <YOUR_API_KEY>" \
--data '{"model":"gpt-4","messages":[{"role":"system","content":"You are ChatGPT, a large language model trained by OpenAI. Follow the user'\''s instructions carefully. Respond using markdown."},{"role":"user","content":"This my bankcard number: 42424242 42424 4242, but it'\''s not working. Who can help me?"}]}'
```

## Local development

.env files are supported in local development. Create a .env file in the root directory with the required values, then run:

```shell
ENV=development go run main.go
```

## Example test-client

```shell
npm install
npx tsc src/index.ts
export OPENAI_API_KEY=<yourapikey>
node src/index.js
```
7 changes: 4 additions & 3 deletions cmd/cli.go
@@ -2,13 +2,14 @@ package cmd

import (
"fmt"
"os"
"os/signal"
"syscall"
"time"

"github.com/openshieldai/openshield/lib"
"github.com/openshieldai/openshield/server"
"github.com/spf13/cobra"
)

var rootCmd = &cobra.Command{
38 changes: 33 additions & 5 deletions cmd/mock_data.go
@@ -2,13 +2,14 @@ package cmd

import (
"fmt"
"math/rand"
"reflect"
"strings"

"github.com/go-faker/faker/v4"
"github.com/openshieldai/openshield/lib"
"github.com/openshieldai/openshield/models"
"gorm.io/gorm"
)

var generatedTags []string
@@ -38,11 +39,33 @@ func createMockData() {
db := lib.DB()
createMockTags(db, 10)
createMockRecords(db, &models.AiModels{}, 1)
apiKey := createMockRecords(db, &models.ApiKeys{}, 1)
createMockRecords(db, &models.AuditLogs{}, 1)
createMockRecords(db, &models.Products{}, 1)
createMockRecords(db, &models.Usage{}, 1)
createMockRecords(db, &models.Workspaces{}, 1)

// Print the created ApiKey with more visibility
if apiKey != nil {
fmt.Println("\n" + strings.Repeat("=", 50))
fmt.Println("🔑 CREATED API KEY 🔑")
fmt.Println(strings.Repeat("=", 50))
fmt.Printf("%s\n", strings.Repeat("-", 30))
fmt.Printf("| %-26s |\n", "API Key Details")
fmt.Printf("%s\n", strings.Repeat("-", 30))
v := reflect.ValueOf(apiKey).Elem()
fieldsToShow := []string{"ProductID", "Status", "ApiKey"}
for _, fieldName := range fieldsToShow {
field := v.FieldByName(fieldName)
if field.IsValid() {
fmt.Printf("| %-12s: %-11v |\n", fieldName, field.Interface())
}
}
fmt.Printf("%s\n", strings.Repeat("-", 30))
fmt.Println(strings.Repeat("=", 50) + "\n")
} else {
fmt.Println("\n❌ No ApiKey was created. ❌")
}
}

func createMockTags(db *gorm.DB, count int) {
@@ -79,7 +102,8 @@ func getRandomTags() string {
return strings.Join(selectedTags, ",")
}

func createMockRecords(db *gorm.DB, model interface{}, count int) interface{} {
var createdModel interface{}
for i := 0; i < count; i++ {
newModel := reflect.New(reflect.TypeOf(model).Elem()).Interface()
if err := faker.FakeData(newModel); err != nil {
@@ -97,9 +121,13 @@
result := db.Create(newModel)
if result.Error != nil {
fmt.Printf("error inserting fake data for %T: %v\n", newModel, result.Error)
} else {
createdModel = newModel
}
}
return createdModel
}

func setValueOfObject(obj interface{}, fieldName string, value interface{}) {
field := reflect.ValueOf(obj).Elem().FieldByName(fieldName)
if field.IsValid() && field.CanSet() {
2 changes: 2 additions & 0 deletions demo/.env.example
@@ -0,0 +1,2 @@
OPENAI_API_KEY="sk-"
HUGGINGFACE_API_KEY="hf_"
2 changes: 0 additions & 2 deletions demo/.env_example

This file was deleted.

41 changes: 41 additions & 0 deletions demo/demo_config.yaml
@@ -0,0 +1,41 @@
settings:
network:
port: 3005
database:
uri: "postgres://postgres:openshield@postgres:5432/postgres"
auto_migration: true
redis:
uri: "redis://redis:6379/0"
ssl: false
cache:
enabled: true
ttl: 3600
rate_limiting:
enabled: true
max: 100
window: 60
expiration: 60
audit_logging:
enabled: true
usage_logging:
enabled: true
rule_server:
url: "http://ruleserver:8000"

providers:
openai:
enabled: true
huggingface:
enabled: true


rules:
input:
- name: "prompt_injection_example"
type: "prompt_injection"
enabled: true
config:
plugin_name: "prompt_injection_llm"
threshold: 0.85
action:
type: "block"
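In the rule above, `threshold: 0.85` means the `block` action fires once the prompt-injection classifier's score reaches the threshold (assuming a meets-or-exceeds comparison). A minimal sketch of that decision — hypothetical, not the rule server's actual code:

```python
# Sketch of threshold-based rule evaluation (hypothetical; the real scoring
# happens in the rule server's prompt_injection_llm plugin).

def apply_rule(score: float, threshold: float = 0.85, action: str = "block") -> str:
    """Return the action for a classifier score: the rule's action, or 'allow'."""
    return action if score >= threshold else "allow"

print(apply_rule(0.92))  # block
print(apply_rule(0.40))  # allow
```

Lowering `threshold` makes the rule more aggressive (more false positives); raising it makes it more permissive.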
14 changes: 11 additions & 3 deletions demo/docker-compose.yml
@@ -6,7 +6,7 @@ services:
environment:
- ENV=production
- PORT="3005"
- DEMO_MODE=true
- OPENAI_API_KEY=${OPENAI_API_KEY}
- HUGGINGFACE_API_KEY=${HUGGINGFACE_API_KEY}
ports:
@@ -15,7 +15,10 @@
- redis
- postgres
volumes:
- ./demo_config.yaml:/app/config.yaml
depends_on:
postgres:
condition: service_healthy

ruleserver:
build:
@@ -35,9 +38,14 @@
POSTGRES_PASSWORD: openshield
ports:
- "5433:5432"
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres"]
interval: 2s
timeout: 2s
retries: 5

adminer:
image: adminer
restart: always
ports:
- 8085:8080
