Dockerfile #176

Merged: 11 commits, Nov 28, 2024
4 changes: 3 additions & 1 deletion .github/workflows/create-release-n-publish.yml
@@ -102,7 +102,9 @@ jobs:
     - name: Test pip install multiple adapters
       run: pip install flowcept[mlflow,tensorboard]
     - name: Install our dependencies
-      run: pip install flowcept[all] # This will install all dependencies, for all adapters and dev deps.
+      run: pip install flowcept[all]
+    - name: Install ml_dev dependencies
+      run: pip install flowcept[ml_dev]
     - name: Pip list
       run: pip list
     - name: Run Docker Compose
24 changes: 24 additions & 0 deletions .github/workflows/run-tests-in-container.yml
@@ -0,0 +1,24 @@
name: Tests inside a Container
on: [pull_request]

jobs:

  build:
    runs-on: ubuntu-latest
    timeout-minutes: 40
    if: "!contains(github.event.head_commit.message, 'CI Bot')"

    steps:
      - uses: actions/checkout@v4

      - name: Show OS Info
        run: '[[ "$OSTYPE" == "linux-gnu"* ]] && { echo "OS Type: Linux"; (command -v lsb_release &> /dev/null && lsb_release -a) || cat /etc/os-release; uname -r; } || [[ "$OSTYPE" == "darwin"* ]] && { echo "OS Type: macOS"; sw_vers; uname -r; } || echo "Unsupported OS type: $OSTYPE"'

      - name: Build Flowcept's image
        run: make build

      - name: Start dependent services (Mongo and Redis)
        run: make services

      - name: Run tests in container
        run: make tests-in-container
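The quoted one-line OS probe in the Show OS Info step is dense. As an illustrative aside (not part of the PR), an equivalent readable form could use `uname -s` instead of bash's `$OSTYPE`, which also works under plain `sh`:

```shell
# Illustrative sketch only: a readable equivalent of the one-line OS probe above.
# Uses `uname -s` rather than bash's $OSTYPE so it also runs under plain sh.
os="$(uname -s)"
case "$os" in
  Linux)
    echo "OS Type: Linux"
    # Prefer lsb_release when available, otherwise fall back to /etc/os-release
    (command -v lsb_release >/dev/null 2>&1 && lsb_release -a) || cat /etc/os-release
    uname -r
    ;;
  Darwin)
    echo "OS Type: macOS"
    sw_vers
    uname -r
    ;;
  *)
    echo "Unsupported OS type: $os"
    ;;
esac
```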
1 change: 1 addition & 0 deletions .github/workflows/run-tests-kafka.yml
@@ -24,6 +24,7 @@ jobs:
       run: |
         python -m pip install --upgrade pip
         python -m pip install .[all]
+        python -m pip install .[ml_dev]

     - name: Run docker compose
       run: docker compose -f deployment/compose-kafka.yml up -d
1 change: 1 addition & 0 deletions .github/workflows/run-tests-py11.yml
@@ -56,6 +56,7 @@ jobs:
       run: |
         python -m pip install --upgrade pip
         python -m pip install .[all]
+        python -m pip install .[ml_dev]

     - name: List installed packages
       run: pip list
21 changes: 17 additions & 4 deletions .github/workflows/run-tests.yml
@@ -32,30 +32,43 @@ jobs:
     - name: Install default dependencies and run simple test
       run: |
         pip install .
-        python examples/simple_instrumented_script.py
+        python examples/simple_instrumented_script.py | tee output.log
+        cat output.log
+        grep -q "ERROR" output.log && exit 1
+        rm output.log

     - name: Install Dask dependencies alone and run a simple Dask test
       run: |
         pip uninstall flowcept -y
         pip install .[dask]
-        python examples/dask_example.py
+        python examples/dask_example.py | tee output.log
+        cat output.log
+        grep -q "ERROR" output.log && exit 1
+        rm output.log

     - name: Install MLFlow dependencies alone and run a simple MLFlow test
       run: |
         pip uninstall flowcept -y
         pip install .[mlflow]
-        python examples/mlflow_example.py
+        python examples/mlflow_example.py | tee output.log
+        cat output.log
+        grep -q "ERROR" output.log && exit 1
+        rm output.log

     - name: Install Tensorboard dependencies alone and run a simple Tensorboard test
       run: |
         pip uninstall flowcept -y
         pip install .[tensorboard]
-        python examples/tensorboard_example.py
+        python examples/tensorboard_example.py | tee output.log
+        cat output.log
+        grep -q "ERROR" output.log && exit 1
+        rm output.log

     - name: Install all dependencies
       run: |
         python -m pip install --upgrade pip
         python -m pip install .[all]
+        python -m pip install .[ml_dev]

     - name: List installed packages
       run: pip list
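The tee/grep/rm sequence repeated in each step above implements a simple log-scan failure check: capture the example's output, print it, and fail the step only if an ERROR line appears. A self-contained sketch of the pattern, with a hypothetical `echo` standing in for the example scripts:

```shell
# Sketch of the log-scan pattern used in the workflow steps above.
# The echo is a stand-in for: python examples/... | tee output.log
echo "task finished ok" | tee output.log
cat output.log
# Fail (exit 1) only when an ERROR line was captured. Note that a failing
# grep on the left side of `&&` does not abort the script, even under
# GitHub Actions' default `bash -e`, so a clean log falls through to rm.
grep -q "ERROR" output.log && exit 1
rm output.log
echo "log scan passed"
```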
46 changes: 33 additions & 13 deletions Makefile
@@ -1,16 +1,19 @@
 # Show help, place this first so it runs with just `make`
 help:
 	@printf "\nCommands:\n"
-	@printf "\033[32mchecks\033[0m run ruff linter and formatter checks\n"
-	@printf "\033[32mreformat\033[0m run ruff linter and formatter\n"
-	@printf "\033[32mclean\033[0m remove cache directories and Sphinx build output\n"
-	@printf "\033[32mdocs\033[0m build HTML documentation using Sphinx\n"
-	@printf "\033[32mservices\033[0m run services using Docker\n"
-	@printf "\033[32mservices-stop\033[0m stop the running Docker services\n"
-	@printf "\033[32mtests\033[0m run unit tests with pytest\n"
-	@printf "\033[32mtests-all\033[0m run all unit tests with pytest, including very long-running ones\n"
-	@printf "\033[32mtests-notebooks\033[0m tests the notebooks, using pytest\n"
+	@printf "\033[32mbuild\033[0m build the Docker image\n"
+	@printf "\033[32mrun\033[0m run the Docker container\n"
+	@printf "\033[32mliveness\033[0m check if the services are alive\n"
+	@printf "\033[32mservices\033[0m run services using Docker\n"
+	@printf "\033[32mservices-stop\033[0m stop the running Docker services\n"
+	@printf "\033[32mtests\033[0m run unit tests with pytest\n"
+	@printf "\033[32mtests-in-container\033[0m run unit tests with pytest inside Flowcept's container\n"
+	@printf "\033[32mtests-all\033[0m run all unit tests with pytest, including very long-running ones\n"
+	@printf "\033[32mtests-notebooks\033[0m tests the notebooks, using pytest\n"
+	@printf "\033[32mclean\033[0m remove cache directories and Sphinx build output\n"
+	@printf "\033[32mdocs\033[0m build HTML documentation using Sphinx\n"
+	@printf "\033[32mchecks\033[0m run ruff linter and formatter checks\n"
+	@printf "\033[32mreformat\033[0m run ruff linter and formatter\n"

 # Run linter and formatter checks using ruff
 checks:
@@ -25,13 +28,15 @@ reformat:

clean:
	rm -rf .ruff_cache
	rm -rf .pytest_cache
	rm -rf mlruns
	rm -rf mnist_data
	rm -rf tensorboard_events
	rm -f docs_dump_tasks_*
	rm -f dump_test.json
	rm -f flowcept.log
	rm -f mlflow.db
	find . -type f -name "*.log" -exec rm -f {} \;
	find . -type f -name "*.pth" -exec rm -f {} \;
	find . -type f -name "mlflow.db" -exec rm -f {} \;
	find . -type d -name "mlruns" -exec rm -rf {} \;
	find . -type d -name "__pycache__" -exec rm -rf {} \; 2>/dev/null
	sphinx-build -M clean docs docs/_build
# Build the HTML documentation using Sphinx
@@ -47,6 +52,20 @@ services:

 services-stop:
 	docker compose --file deployment/compose.yml down --volumes

+# Build a new Docker image for Flowcept
+build:
+	bash deployment/build-image.sh
+
+run:
+	docker run --rm -v $(shell pwd):/flowcept -e KVDB_HOST=flowcept_redis -e MQ_HOST=flowcept_redis -e MONGO_HOST=flowcept_mongo --network flowcept_default -it flowcept
+
+tests-in-container:
+	docker run --rm -v $(shell pwd):/flowcept -e KVDB_HOST=flowcept_redis -e MQ_HOST=flowcept_redis -e MONGO_HOST=flowcept_mongo --network flowcept_default flowcept /opt/conda/envs/flowcept/bin/pytest --ignore=tests/decorator_tests/ml_tests
+
+# This command can be removed once we have our CLI
+liveness:
+	python -c 'from flowcept import Flowcept; print(Flowcept.services_alive())'
+
 # Run unit tests using pytest
 .PHONY: tests
 tests:
@@ -59,3 +78,4 @@ tests-notebooks:
.PHONY: tests-all
tests-all:
	pytest
27 changes: 27 additions & 0 deletions deployment/Dockerfile
@@ -0,0 +1,27 @@
# Use the command `make build` to build this image.
FROM miniconda:local

# Install basic utilities (vim, curl, wget, make)
RUN apt-get update && \
    apt-get install -y vim curl wget make \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /flowcept

COPY pyproject.toml Makefile README.md ./
COPY src ./src
COPY resources ./resources
COPY notebooks ./notebooks
COPY tests ./tests
COPY examples ./examples

RUN export FLOWCEPT_SETTINGS_PATH=$(realpath resources/sample_settings.yaml) \
    && echo "export FLOWCEPT_SETTINGS_PATH=$FLOWCEPT_SETTINGS_PATH" >> ~/.bashrc

RUN conda create -n flowcept python=3.11.10 -y \
    && echo "conda activate flowcept" >> ~/.bashrc

RUN conda run -n flowcept pip install -e .[all] # This is an overkill and will install many things you might not need. Please modify deployment/Dockerfile in case you do not need to install "all" dependencies.

# Default command
CMD ["bash"]
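One subtlety in the Dockerfile above: an `export` inside a RUN instruction does not persist into later layers or into the running container, which is why the value is also appended to `~/.bashrc`. A minimal illustration of the resolution step itself, using a hypothetical throwaway directory that is not part of the PR:

```shell
# Illustration only: resolve a settings file to an absolute path, as the
# Dockerfile's RUN step does. The /tmp directory here is hypothetical.
mkdir -p /tmp/flowcept_demo/resources
touch /tmp/flowcept_demo/resources/sample_settings.yaml
cd /tmp/flowcept_demo
FLOWCEPT_SETTINGS_PATH=$(realpath resources/sample_settings.yaml)
echo "$FLOWCEPT_SETTINGS_PATH"
```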
37 changes: 37 additions & 0 deletions deployment/build-image.sh
@@ -0,0 +1,37 @@
#!/bin/bash

if [ ! -d "src" ]; then
    echo "Error: 'src' directory does not exist in the current path. Please run this script from the project root."
    exit 1
fi

# Download the Miniconda Dockerfile
echo "Downloading Miniconda Dockerfile..."
curl --silent -o Dockerfile_miniconda https://raw.githubusercontent.com/anaconda/docker-images/refs/heads/main/miniconda3/debian/Dockerfile
cat Dockerfile_miniconda

# Build the Miniconda image locally
echo "Building miniconda:local image..."
docker build -t miniconda:local -f Dockerfile_miniconda .
BUILD_STATUS=$?
rm Dockerfile_miniconda

# Check if the Miniconda build failed (status is captured before the rm above overwrites $?)
if [ $BUILD_STATUS -ne 0 ]; then
    echo "Error: Miniconda image build failed."
    exit 1
fi

echo "Miniconda image built successfully."
# Build the flowcept image with the 'latest' tag
echo "Building flowcept image with the latest tag..."
docker build -t flowcept:latest -f deployment/Dockerfile .

# Check if the flowcept build succeeded
if [ $? -eq 0 ]; then
    echo "Flowcept image built successfully with tag 'latest'."
    echo "You can now run it using: make run"
else
    echo "Failed to build flowcept image."
    exit 1
fi
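As an aside on the status checks in this script: inspecting `$?` is fragile whenever any command runs between the build and the test, so testing the command directly is a sturdier pattern. A runnable sketch, with a hypothetical `fake_build` function standing in for `docker build`:

```shell
# Sketch: branch on a command's result directly instead of inspecting $?.
# `fake_build` is a hypothetical stand-in for `docker build ...`.
fake_build() { return 0; }

if ! fake_build; then
  echo "Error: image build failed."
  exit 1
fi
echo "image built successfully"
```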

31 changes: 0 additions & 31 deletions deployment/compose-full.yml

This file was deleted.

20 changes: 10 additions & 10 deletions deployment/compose.yml
@@ -9,18 +9,18 @@ services:
   flowcept_mongo:
     container_name: flowcept_mongo
     image: mongo:latest
-    # volumes:
-    #  - /Users/rsr/Downloads/mongo_data/db:/data/db
     ports:
       - 27017:27017

 networks:
   flowcept:
     driver: bridge


-# # This is just for the cases where one does not want to use the same Redis instance for caching and messaging, but
-# # it's not required to have separate instances.
-# # local_interceptor_cache:
-# #   container_name: local_interceptor_cache
-# #   image: redis
-# #   ports:
-# #     - 60379:6379
+# This is just for the cases where one does not want to use the same Redis instance for caching and messaging, but
+# it's not required to have separate instances.
+# local_interceptor_cache:
+#   container_name: local_interceptor_cache
+#   image: redis
+#   ports:
+#     - 60379:6379
1 change: 0 additions & 1 deletion pyproject.toml
@@ -88,7 +88,6 @@ all = [
     "flowcept[responsibleai]",
     "flowcept[tensorboard]",
     "flowcept[dev]",
-    "flowcept[ml_dev]"
 ]

[tool.hatch.version]
6 changes: 0 additions & 6 deletions resources/sample_settings.yaml
@@ -91,23 +91,17 @@ adapters:
     file_path: mlflow.db
     log_params: ['*']
     log_metrics: ['*']
-    redis_host: localhost
-    redis_port: 6379
     watch_interval_sec: 2

   tensorboard:
     kind: tensorboard
     file_path: tensorboard_events
     log_tags: ['scalars', 'hparams', 'tensors']
     log_metrics: ['accuracy']
-    redis_host: localhost
-    redis_port: 6379
     watch_interval_sec: 5

   dask:
     kind: dask
-    redis_host: localhost
-    redis_port: 6379
     worker_should_get_input: true
     scheduler_should_get_input: true
     worker_should_get_output: true
11 changes: 5 additions & 6 deletions src/flowcept/configs.py
@@ -17,11 +17,10 @@

 if not os.path.exists(SETTINGS_PATH):
     SETTINGS_PATH = None
-    import importlib.resources
+    from importlib import resources

-    with importlib.resources.open_text("resources", "sample_settings.yaml") as f:
+    with resources.files("resources").joinpath("sample_settings.yaml").open("r") as f:
         settings = OmegaConf.load(f)
-
 else:
     settings = OmegaConf.load(SETTINGS_PATH)

@@ -82,9 +81,9 @@
 # MongoDB Settings #
 ######################

-MONGO_URI = settings["mongodb"].get("uri", os.environ.get("MONGO_URI", None))
-MONGO_HOST = settings["mongodb"].get("host", os.environ.get("MONGO_HOST", "localhost"))
-MONGO_PORT = int(settings["mongodb"].get("port", os.environ.get("MONGO_PORT", "27017")))
+MONGO_URI = os.environ.get("MONGO_URI", settings["mongodb"].get("uri", None))
+MONGO_HOST = os.environ.get("MONGO_HOST", settings["mongodb"].get("host", "localhost"))
+MONGO_PORT = int(os.environ.get("MONGO_PORT", settings["mongodb"].get("port", 27017)))
 MONGO_DB = settings["mongodb"].get("db", PROJECT_NAME)
 MONGO_CREATE_INDEX = settings["mongodb"].get("create_collection_index", True)
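The configs.py change above flips the lookup order so that environment variables (such as the `MONGO_HOST` injected by `make run` and `make tests-in-container`) override the settings file, with a hard-coded default as the last resort. The same precedence, sketched in shell with illustrative variable names that are not from the PR:

```shell
# Sketch of the new lookup order: env var first, settings-file value second,
# hard-coded default last. Variable names here are illustrative only.
SETTINGS_MONGO_HOST="localhost"      # what the YAML settings file would provide
MONGO_HOST="flowcept_mongo"          # env var, e.g. set by `make run`

EFFECTIVE_HOST="${MONGO_HOST:-${SETTINGS_MONGO_HOST:-localhost}}"
echo "$EFFECTIVE_HOST"
```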
2 changes: 0 additions & 2 deletions src/flowcept/flowceptor/adapters/dask/dask_dataclasses.py
@@ -11,8 +11,6 @@
 class DaskSettings(BaseSettings):
     """Dask settings."""

-    redis_port: int
-    redis_host: str
     worker_should_get_input: bool
     worker_should_get_output: bool
     scheduler_should_get_input: bool
2 changes: 1 addition & 1 deletion src/flowcept/flowceptor/adapters/dask/dask_plugins.py
@@ -66,7 +66,7 @@ def transition(self, key, start, finish, *args, **kwargs):
         """Get the transition."""
         self.interceptor.callback(key, start, finish, args, kwargs)

-    def close(self):
+    async def close(self):
         """Close it."""
         self.interceptor.logger.debug("Going to close scheduler!")
         self.interceptor.stop()