feat: Automated Preset Docker Image Building #56

Closed · wants to merge 20 commits
120 changes: 120 additions & 0 deletions .github/workflows/preset-image-build.yml
@@ -0,0 +1,120 @@
name: Build and Push Preset Models

on:
  push:
    branches:
      - main
      - Ishaan/auto-image-build
    paths:
      - 'pkg/presets/falcon/**'
      - 'pkg/presets/llama-2/**'
      - 'pkg/presets/llama-2-chat/**'
  workflow_dispatch:
    inputs:
      release:
        description: 'Release (yes/no)'
        required: true
        default: 'no'
      image_tag:
        description: 'Image Tag'
        required: false

permissions:
  id-token: write
  contents: read

jobs:
  setup:
    runs-on: ubuntu-20.04
    outputs:
      image_tag: ${{ steps.set_tag.outputs.image_tag }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Install Azure CLI latest
        run: |
          curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

      - uses: azure/login@v1
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          allow-no-subscriptions: true

      - name: Set Image Tag
        id: set_tag
        run: |
          # "::set-output" is deprecated; write to $GITHUB_OUTPUT instead.
          if [[ "${{ github.event_name }}" == "workflow_dispatch" && -n "${{ github.event.inputs.image_tag }}" ]]; then
            echo "image_tag=${{ github.event.inputs.image_tag }}" >> "$GITHUB_OUTPUT"
          else
            echo "image_tag=$(git rev-parse --short HEAD)" >> "$GITHUB_OUTPUT"
          fi

      - name: 'Login to ACR'
        run: az acr login --name aimodelsregistry

  # NOTE: `github.event.commits[0].modified` only inspects the first commit of a
  # push, so multi-commit pushes can skip builds; a `paths` filter or a
  # changed-files action would be more robust. Each job below also runs on a
  # fresh runner, so in practice it needs its own checkout and registry login
  # (as in `setup`) before building.
  falcon:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/falcon/')
    steps:
      - name: Build and push Falcon model
        run: |
          cd docker/presets/falcon
          az acr build -t aimodelsregistry.azurecr.io/falcon:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .

  llama-2-7b:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/llama-2/')
    steps:
      - name: Build and push Llama model
        run: |
          az acr build --build-arg LLAMA_VERSION=llama-2-7b --build-arg SRC_DIR=pkg/presets/llama-2 -t aimodelsregistry.azurecr.io/llama-2-7b:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .

  llama-2-13b:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/llama-2/')
    steps:
      - name: Build and push Llama model
        run: |
          az acr build --build-arg LLAMA_VERSION=llama-2-13b --build-arg SRC_DIR=pkg/presets/llama-2 -t aimodelsregistry.azurecr.io/llama-2-13b:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .

  llama-2-70b:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/llama-2/')
    steps:
      - name: Build and push Llama model
        run: |
          az acr build --build-arg LLAMA_VERSION=llama-2-70b --build-arg SRC_DIR=pkg/presets/llama-2 -t aimodelsregistry.azurecr.io/llama-2-70b:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .

  llama-2-7b-chat:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/llama-2-chat/')
    steps:
      - name: Build and push Llama chat model
        run: |
          az acr build --build-arg LLAMA_VERSION=llama-2-7b-chat --build-arg SRC_DIR=pkg/presets/llama-2-chat -t aimodelsregistry.azurecr.io/llama-2-7b-chat:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .

  llama-2-13b-chat:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/llama-2-chat/')
    steps:
      - name: Build and push Llama chat model
        run: |
          az acr build --build-arg LLAMA_VERSION=llama-2-13b-chat --build-arg SRC_DIR=pkg/presets/llama-2-chat -t aimodelsregistry.azurecr.io/llama-2-13b-chat:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .

  llama-2-70b-chat:
    needs: setup
    runs-on: ubuntu-20.04
    if: contains(github.event.commits[0].modified, 'pkg/presets/llama-2-chat/')
    steps:
      - name: Build and push Llama chat model
        run: |
          az acr build --build-arg LLAMA_VERSION=llama-2-70b-chat --build-arg SRC_DIR=pkg/presets/llama-2-chat -t aimodelsregistry.azurecr.io/llama-2-70b-chat:${{ needs.setup.outputs.image_tag }} -r aimodelsregistry .
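
The tag-selection logic in the "Set Image Tag" step can be sketched as a standalone shell function for local testing (the function name is illustrative, not part of the workflow): an explicitly supplied `workflow_dispatch` tag wins, otherwise the short commit SHA is used.

```shell
# Sketch of the "Set Image Tag" fallback logic, extracted as a function.
# Arguments mirror the workflow context: event name, then the (possibly
# empty) image_tag input.
resolve_image_tag() {
  event_name="$1"
  input_tag="$2"
  if [ "$event_name" = "workflow_dispatch" ] && [ -n "$input_tag" ]; then
    # Manual run with an explicit tag: use it verbatim.
    echo "$input_tag"
  else
    # Push events (or a dispatch without a tag) fall back to the commit SHA.
    git rev-parse --short HEAD
  fi
}
```

The SHA fallback keeps every push build traceable to a commit, while the manual input allows release-style tags such as `v0.1.0`.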

2 changes: 1 addition & 1 deletion docker/presets/falcon/Dockerfile
@@ -2,7 +2,7 @@
FROM nvcr.io/nvidia/pytorch:23.06-py3

# Set the working directory
-WORKDIR /workspace/huggingface
+WORKDIR /workspace/falcon

# First, copy just the requirements.txt file and install dependencies
# This is done before copying the code to utilize Docker's layer caching and
12 changes: 0 additions & 12 deletions docker/presets/llama-2-chat/Dockerfile

This file was deleted.

35 changes: 34 additions & 1 deletion docker/presets/llama-2/Dockerfile
@@ -1,3 +1,15 @@
# Build text completion model
# docker build \
# --build-arg LLAMA_VERSION=llama-2-7b \
# --build-arg SRC_DIR=pkg/presets/llama-2 \
# -t llama-2-7b:latest .

# Build chat completion model
# docker build \
# --build-arg LLAMA_VERSION=llama-2-7b-chat \
# --build-arg SRC_DIR=pkg/presets/llama-2-chat \
# -t llama-2-7b-chat:latest .

FROM nvcr.io/nvidia/pytorch:23.06-py3
WORKDIR /workspace

@@ -9,4 +21,25 @@ RUN pip install -e .
RUN pip install fastapi pydantic
RUN pip install 'uvicorn[standard]'

-ADD pkg/presets/llama-2 /workspace/llama/llama-2
ARG LLAMA_VERSION
ARG SRC_DIR

# Fetch the weights matching LLAMA_VERSION (chat variants get chat URLs,
# base variants get base URLs)
RUN if [ "$LLAMA_VERSION" = "llama-2-7b" ]; then \
        wget -P ${SRC_DIR}/weights <URL_for_7b_model>; \
    elif [ "$LLAMA_VERSION" = "llama-2-13b" ]; then \
        wget -P ${SRC_DIR}/weights <URL_for_13b_model>; \
    elif [ "$LLAMA_VERSION" = "llama-2-70b" ]; then \
        wget -P ${SRC_DIR}/weights <URL_for_70b_model>; \
    elif [ "$LLAMA_VERSION" = "llama-2-7b-chat" ]; then \
        wget -P ${SRC_DIR}/weights <URL_for_7b_chat_model>; \
    elif [ "$LLAMA_VERSION" = "llama-2-13b-chat" ]; then \
        wget -P ${SRC_DIR}/weights <URL_for_13b_chat_model>; \
    elif [ "$LLAMA_VERSION" = "llama-2-70b-chat" ]; then \
        wget -P ${SRC_DIR}/weights <URL_for_70b_chat_model>; \
    else \
        echo "Invalid or missing LLAMA_VERSION"; \
        exit 1; \
    fi

ADD ${SRC_DIR} /workspace/llama/llama-2
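
The version-to-URL mapping in the `RUN` chain above can also be expressed as a `case` statement, which fails closed on unknown versions just like the `if`/`elif` chain. A sketch (the function name is illustrative, and the URLs remain the same placeholders used in the Dockerfile):

```shell
# Map a LLAMA_VERSION value to its weights URL placeholder; unknown
# versions print an error and return non-zero, mirroring the Dockerfile's
# `exit 1` branch.
weights_url() {
  case "$1" in
    llama-2-7b)        echo "<URL_for_7b_model>" ;;
    llama-2-13b)       echo "<URL_for_13b_model>" ;;
    llama-2-70b)       echo "<URL_for_70b_model>" ;;
    llama-2-7b-chat)   echo "<URL_for_7b_chat_model>" ;;
    llama-2-13b-chat)  echo "<URL_for_13b_chat_model>" ;;
    llama-2-70b-chat)  echo "<URL_for_70b_chat_model>" ;;
    *) echo "Invalid or missing LLAMA_VERSION" >&2; return 1 ;;
  esac
}

# The Dockerfile's download step is then a single line:
#   wget -P "${SRC_DIR}/weights" "$(weights_url "$LLAMA_VERSION")"
```

Keeping one parameterized Dockerfile, rather than one per variant, is what allows the deletion of `docker/presets/llama-2-chat/Dockerfile` in this PR.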