Merge pull request #71 from hillct/add-docker-support
fix: further enhance Docker and docker-compose support with staged, --target and --profile support, plus Coolify Deployment Support
coleam00 authored Oct 31, 2024
2 parents 3cbe207 + c116338 commit 349c5d5
Showing 6 changed files with 222 additions and 51 deletions.
115 changes: 109 additions & 6 deletions CONTRIBUTING.md
@@ -8,6 +8,7 @@ First off, thank you for considering contributing to Bolt.new! This fork aims to
- [Pull Request Guidelines](#pull-request-guidelines)
- [Coding Standards](#coding-standards)
- [Development Setup](#development-setup)
- [Deployment with Docker](#docker-deployment-documentation)
- [Project Structure](#project-structure)

## Code of Conduct
@@ -88,11 +89,113 @@ pnpm run dev

**Note**: You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.

## Questions?
## Testing

For any questions about contributing, please:
1. Check existing documentation
2. Search through issues
3. Create a new issue with the question label
Run the test suite with:

Thank you for contributing to Bolt.new! 🚀
```bash
pnpm test
```
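
The `package.json` in this commit also defines a watch mode for iterative work:

```bash
# Re-runs tests on file change (the "test:watch" script, which runs vitest)
pnpm run test:watch
```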

## Deployment

To deploy the application to Cloudflare Pages:

```bash
pnpm run deploy
```

Make sure you have the necessary permissions and Wrangler is correctly configured for your Cloudflare account.
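
If you are unsure whether Wrangler is configured, the standard Wrangler commands below (not specific to this project) authenticate you and show the active account:

```bash
# Log in to Cloudflare (opens a browser) and confirm which account is active
pnpm exec wrangler login
pnpm exec wrangler whoami
```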

# Docker Deployment Documentation

This guide outlines various methods for building and deploying the application using Docker.

## Build Methods

### 1. Using Helper Scripts

NPM scripts are provided for convenient building:

```bash
# Development build
npm run dockerbuild

# Production build
npm run dockerbuild:prod
```

### 2. Direct Docker Build Commands

You can use Docker's target feature to specify the build environment:

```bash
# Development build
docker build . --target bolt-ai-development

# Production build
docker build . --target bolt-ai-production
```
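
If you build this way, consider also tagging the images so they match the names used by the `docker run` commands below (the helper scripts take care of tagging), for example:

```bash
# Tag the builds so they can be referenced by name when running
docker build . --target bolt-ai-development -t bolt-ai:development
docker build . --target bolt-ai-production -t bolt-ai:production
```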

### 3. Docker Compose with Profiles

Use Docker Compose profiles to manage different environments:

```bash
# Development environment
docker-compose --profile development up

# Production environment
docker-compose --profile production up
```
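
Compose builds the image on first use; the usual `up` flags apply if you want to force a rebuild or run in the background, for example:

```bash
# Rebuild the production image and start it detached
docker-compose --profile production up --build -d
```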

## Running the Application

After building using any of the methods above, run the container with:

```bash
# Development
docker run -p 5173:5173 --env-file .env.local bolt-ai:development

# Production
docker run -p 5173:5173 --env-file .env.local bolt-ai:production
```
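
If you run the development image directly rather than through Compose, you can mirror the bind mounts declared in `docker-compose.yaml` so local edits reach the dev server:

```bash
# Optional: mount the source tree (keeping the container's node_modules) for hot reload
docker run -p 5173:5173 --env-file .env.local -v "$PWD:/app" -v /app/node_modules bolt-ai:development
```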

## Deployment with Coolify

[Coolify](https://github.com/coollabsio/coolify) provides a straightforward deployment process:

1. Import your Git repository as a new project
2. Select your target environment (development/production)
3. Choose "Docker Compose" as the Build Pack
4. Configure deployment domains
5. Set the custom start command:
```bash
docker compose --profile production up
```
6. Configure environment variables
- Add necessary AI API keys
- Adjust other environment variables as needed
7. Deploy the application

## VS Code Integration

The `docker-compose.yaml` configuration is compatible with VS Code dev containers (a minimal configuration sketch follows the steps below):

1. Open the command palette in VS Code
2. Select the dev container configuration
3. Choose the "development" profile from the context menu
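
If the repository does not yet include a dev container definition, a minimal `.devcontainer/devcontainer.json` targeting the `bolt-ai-dev` service might look like the sketch below (file name and contents are assumptions, not part of this commit):

```jsonc
// .devcontainer/devcontainer.json (minimal sketch, not included in this commit)
{
  "name": "bolt-ai-dev",
  "dockerComposeFile": "../docker-compose.yaml",
  "service": "bolt-ai-dev",
  "workspaceFolder": "/app",
  "shutdownAction": "stopCompose"
}
```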

## Environment Files

Ensure you have the appropriate `.env.local` file configured before running the containers (a placeholder example follows this list). This file should contain:
- API keys
- Environment-specific configurations
- Other required environment variables
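
The variable names read by the `Dockerfile` and `docker-compose.yaml` in this commit are shown below; the values are placeholders only:

```bash
# .env.local (placeholder values; set only the providers you actually use)
GROQ_API_KEY=your-groq-key
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
OPEN_ROUTER_API_KEY=your-openrouter-key
GOOGLE_GENERATIVE_AI_API_KEY=your-google-key
OLLAMA_API_BASE_URL=http://host.docker.internal:11434   # assumes Ollama on the host at its default port
VITE_LOG_LEVEL=debug
```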

## Notes

- Port 5173 is exposed and mapped for both development and production environments
- Environment variables are loaded from `.env.local`
- Different profiles (development/production) can be used for different deployment scenarios
- The configuration supports both local development and production deployment
76 changes: 57 additions & 19 deletions Dockerfile
@@ -1,29 +1,67 @@
# Use an official Node.js runtime as the base image
FROM node:20.15.1
ARG BASE=node:20.18.0
FROM ${BASE} AS base

# Set the working directory in the container
WORKDIR /app

# Install pnpm
RUN npm install -g [email protected]
# Install dependencies (this step is cached as long as the dependencies don't change)
COPY package.json pnpm-lock.yaml ./

# Copy package.json and pnpm-lock.yaml (if available)
COPY package.json pnpm-lock.yaml* ./
RUN corepack enable pnpm && pnpm install

# Install dependencies
RUN pnpm install

# Copy the rest of the application code
# Copy the rest of your app's source code
COPY . .

# Build the application
RUN pnpm run build
# Expose the port the app runs on
EXPOSE 5173

# Production image
FROM base AS bolt-ai-production

# Define environment variables with default values or let them be overridden
ARG GROQ_API_KEY
ARG OPENAI_API_KEY
ARG ANTHROPIC_API_KEY
ARG OPEN_ROUTER_API_KEY
ARG GOOGLE_GENERATIVE_AI_API_KEY
ARG OLLAMA_API_BASE_URL
ARG VITE_LOG_LEVEL=debug

ENV WRANGLER_SEND_METRICS=false \
GROQ_API_KEY=${GROQ_API_KEY} \
OPENAI_API_KEY=${OPENAI_API_KEY} \
ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY} \
OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY} \
GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY} \
OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL} \
VITE_LOG_LEVEL=${VITE_LOG_LEVEL}

# Pre-configure wrangler to disable metrics
RUN mkdir -p /root/.config/.wrangler && \
echo '{"enabled":false}' > /root/.config/.wrangler/metrics.json

RUN npm run build

CMD [ "pnpm", "run", "dockerstart"]

# Development image
FROM base AS bolt-ai-development

# Make sure bindings.sh is executable
RUN chmod +x bindings.sh
# Define the same environment variables for development
ARG GROQ_API_KEY
ARG OPENAI_API_KEY
ARG ANTHROPIC_API_KEY
ARG OPEN_ROUTER_API_KEY
ARG GOOGLE_GENERATIVE_AI_API_KEY
ARG OLLAMA_API_BASE_URL
ARG VITE_LOG_LEVEL=debug

# Expose the port the app runs on (adjust if you specified a different port)
EXPOSE 3000
ENV GROQ_API_KEY=${GROQ_API_KEY} \
OPENAI_API_KEY=${OPENAI_API_KEY} \
ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY} \
OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY} \
GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY} \
OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL} \
VITE_LOG_LEVEL=${VITE_LOG_LEVEL}

# Start the application
CMD ["pnpm", "run", "start"]
RUN mkdir -p ${WORKDIR}/run
CMD pnpm run dev --host
48 changes: 48 additions & 0 deletions docker-compose.yaml
@@ -0,0 +1,48 @@
services:
bolt-ai:
image: bolt-ai:production
build:
context: .
dockerfile: Dockerfile
target: bolt-ai-production
ports:
- "5173:5173"
env_file: ".env.local"
environment:
- NODE_ENV=production
- COMPOSE_PROFILES=production
# Not strictly needed, but serves as hints for Coolify
- PORT=5173
- GROQ_API_KEY=${GROQ_API_KEY}
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY}
- GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY}
- OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}
- VITE_LOG_LEVEL=${VITE_LOG_LEVEL:-debug}
command: pnpm run dockerstart
profiles:
- production # This service only runs in the production profile

bolt-ai-dev:
image: bolt-ai:development
build:
target: bolt-ai-development
environment:
- NODE_ENV=development
- COMPOSE_PROFILES=development
- PORT=5173
- GROQ_API_KEY=${GROQ_API_KEY}
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY}
- GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY}
- OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}
- VITE_LOG_LEVEL=${VITE_LOG_LEVEL:-debug}
volumes:
- .:/app
- /app/node_modules
ports:
- "5173:5173" # Same port, no conflict as only one runs at a time
command: pnpm run dev --host 0.0.0.0
profiles: ["development", "default"] # Make development the default profile
24 changes: 0 additions & 24 deletions docker-compose.yml

This file was deleted.

9 changes: 7 additions & 2 deletions package.json
@@ -13,7 +13,11 @@
"test:watch": "vitest",
"lint": "eslint --cache --cache-location ./node_modules/.cache/eslint .",
"lint:fix": "npm run lint -- --fix",
"start": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 3000",
"start": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings",
"dockerstart": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 5173 --no-show-interactive-dev-session",
"dockerrun": "docker run -it -d --name bolt-ai-live -p 5173:5173 --env-file .env.local bolt-ai",
"dockerbuild:prod": "docker build -t bolt-ai:production bolt-ai:latest --target bolt-ai-production .",
"dockerbuild": "docker build -t bolt-ai:development -t bolt-ai:latest --target bolt-ai-development .",
"typecheck": "tsc",
"typegen": "wrangler types",
"preview": "pnpm run build && pnpm run start"
@@ -110,5 +114,6 @@
},
"resolutions": {
"@typescript-eslint/utils": "^8.0.0-alpha.30"
}
},
"packageManager": "[email protected]+sha512.22721b3a11f81661ae1ec68ce1a7b879425a1ca5b991c975b074ac220b187ce56c708fe5db69f4c962c989452eee76c82877f4ee80f474cebd61ee13461b6228"
}
1 change: 1 addition & 0 deletions wrangler.toml
@@ -3,3 +3,4 @@ name = "bolt"
compatibility_flags = ["nodejs_compat"]
compatibility_date = "2024-07-01"
pages_build_output_dir = "./build/client"
send_metrics = false
