In this project, you will apply the skills you have acquired in this course to operationalize a Machine Learning Microservice API.
You are given a pre-trained `sklearn` model that has been trained to predict housing prices in Boston according to several features, such as the average number of rooms in a home and data about highway access, teacher-to-pupil ratios, and so on. You can read more about the data, which was initially taken from Kaggle, on the data source site. This project tests your ability to operationalize a Python Flask app, provided in the file `app.py`, that serves out predictions (inference) about housing prices through API calls. This project could be extended to any pre-trained machine learning model, such as those for image recognition and data labeling.
Your project goal is to operationalize this working machine learning microservice using Kubernetes, an open-source system for automating the management of containerized applications. In this project you will:
- Test your project code using linting (see the sketch below)
- Complete a Dockerfile to containerize this application
- Deploy your containerized application using Docker and make a prediction
- Improve the log statements in the source code for this application
- Configure Kubernetes and create a Kubernetes cluster
- Deploy a container using Kubernetes and make a prediction
- Upload a complete GitHub repo with CircleCI to indicate that your code has been tested
You can find a detailed project rubric here.
The final implementation of the project will showcase your ability to operationalize production microservices.
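For the linting task in the list above, here is a minimal sketch of what the check typically runs, assuming the project's Makefile exposes a `lint` target that uses `hadolint` for the Dockerfile and `pylint` for the Python source (both tool choices are assumptions, not requirements stated here):

```bash
# Lint the Dockerfile (requires hadolint to be installed locally)
hadolint Dockerfile

# Lint the Flask app; disabling refactor (R) and convention (C) messages
# keeps the output focused on likely errors (flag choice is illustrative)
pylint --disable=R,C app.py
```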
- Create a virtualenv with Python 3.7 and activate it. Refer to this link for help on specifying the Python version in the virtualenv.
```bash
python3 -m pip install --user virtualenv

# You should have Python 3.7 available on your host.
# Check the Python path using `which python3`
# Use a command similar to this one:
python3 -m virtualenv --python=<path-to-Python3.7> .devops
source .devops/bin/activate
```
- Run `make install` to install the necessary dependencies
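As a rough sketch of what the `install` target typically runs; the exact Makefile contents may differ, and the `requirements.txt` filename is an assumption about how the dependencies are listed:

```bash
# Upgrade pip inside the active virtualenv, then install the project's
# pinned dependencies (e.g. flask and scikit-learn) from requirements.txt
pip install --upgrade pip
pip install -r requirements.txt
```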
There are three ways to run the app:
- Standalone: `python app.py`
- Run in Docker: `./run_docker.sh` (see the sketch after this list)
- Run in Kubernetes: `./run_kubernetes.sh`
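A minimal sketch of what `./run_docker.sh` generally needs to do; the image name, tag, and port mapping below are illustrative assumptions rather than the project's actual values:

```bash
#!/usr/bin/env bash

# Build the image from the provided Dockerfile
docker build --tag=ml-microservice .

# Confirm the image was built
docker image ls

# Run the container, mapping host port 8000 to the port the Flask app
# listens on inside the container (assumed to be 80 here)
docker run -p 8000:80 ml-microservice
```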
- Set up and configure Docker locally
- Set up and configure Kubernetes locally
- Create the Flask app in a container: `./run_docker.sh`, then `./upload_docker.sh` to upload the image
- Run via kubectl: `./run_kubernetes.sh` (sketched below)
- Run `./make_prediction.sh`, which should return something like: `Port: 8000 { "prediction": [ 20.35373177134412 ] }`
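For orientation, a sketch of the shape `./run_kubernetes.sh` usually takes; the Docker Hub path, pod name, and ports are placeholders, and your script may use different `kubectl` options:

```bash
#!/usr/bin/env bash

# Path to the image on Docker Hub (placeholder account and repository)
dockerpath="<your-dockerhub-username>/ml-microservice"

# Run the container in the cluster as a pod named ml-microservice
kubectl run ml-microservice --image="${dockerpath}" --port=80

# List the pods to confirm the deployment is up
kubectl get pods

# Forward a host port to the container port so the API is reachable locally
kubectl port-forward pod/ml-microservice 8000:80
```

And a sketch of the kind of request `./make_prediction.sh` sends; the `/predict` endpoint, port 8000, and the JSON payload (Boston housing feature names with sample values) are assumptions about how `app.py` and its model are set up:

```bash
#!/usr/bin/env bash
PORT=8000
echo "Port: $PORT"

# POST housing features as JSON to the app's prediction endpoint
curl -d '{
  "CHAS": {"0": 0},
  "RM": {"0": 6.575},
  "TAX": {"0": 296.0},
  "PTRATIO": {"0": 15.3},
  "B": {"0": 396.9},
  "LSTAT": {"0": 4.98}
}' \
  -H "Content-Type: application/json" \
  -X POST "http://localhost:$PORT/predict"
```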