Commit b7a7ce2
JAlcocerT committed on Dec 13, 2023
1 parent a9e6a06
Showing 6 changed files with 280 additions and 2 deletions.
@@ -0,0 +1,26 @@
#pip install diagrams

from diagrams import Diagram, Cluster
from diagrams.custom import Custom
from diagrams.onprem.database import MongoDB
from diagrams.onprem.analytics import Metabase
from diagrams.generic.storage import Storage
from urllib.request import urlretrieve

# Define the URL for the Python icon and the local file name
python_url = "https://github.com/abranhe/languages.abranhe.com/raw/master/languages/python.png"
python_icon = "python.png"

# Download the Python icon from the URL
urlretrieve(python_url, python_icon)

# Specify the desired output filename
output_filename = "./your_workflow_diagram"

with Diagram("Python to MongoDB to Metabase Workflow", show=False, filename=output_filename):
    custom_icon = Custom("Custom", "./DHT11.png")  # local DHT11 sensor icon, expected next to the script
    python_code = Custom("Python Code", "./python.png")
    mongodb = MongoDB("MongoDB")
    metabase = Metabase("Metabase")

    custom_icon >> python_code >> mongodb >> metabase
@@ -0,0 +1,23 @@
---
title: Machine Learning on SBCs
author: JAlcocerT
date: 2024-01-01 00:10:00 +0800
categories: [IoT & Data Analytics]
tags: [Sensors,Python,MongoDB]
image:
  path: /img/metabase.png
  alt: IoT Project with Python, MongoDB, DHT11/22 sensors and Metabase.
render_with_liquid: false
---

## RPi with Vision

## FAQ

### BI Tools for EDA

* <https://www.opensourcealternative.to/project/LightDashs>
* <https://www.opensourcealternative.to/project/Metabase>
@@ -0,0 +1,83 @@
---
title: Machine Learning Ops with SBCs
author: JAlcocerT
date: 2024-01-01 00:10:00 +0800
categories: [IoT & Data Analytics]
tags: [Sensors,Python,MongoDB]
image:
  path: /img/metabase.png
  alt: IoT Project with Python, MongoDB, DHT11/22 sensors and Metabase.
render_with_liquid: false
---

## Gitea

* <https://fossengineer.com/selfhosting-Gitea-docker/>

## Jenkins

* <https://fossengineer.com/selfhosting-jenkins-ci-cd/>
## SonarQube

GitHub Actions can trigger SonarQube scans on events such as pushes or pull requests.
You add a step to your GitHub Actions workflow that runs the SonarQube scanner: it points the analysis at your SonarQube server and makes the scan part of the pipeline.
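A minimal workflow sketch, assuming a self-hosted SonarQube server and the SonarSource scan action (the file path and secret names are placeholders):

```yml
# .github/workflows/sonarqube.yml (illustrative sketch)
name: SonarQube Scan
on: [push, pull_request]

jobs:
  sonarqube:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # full history gives SonarQube better analysis data
      - uses: sonarsource/sonarqube-scan-action@master  # pin a released version in practice
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}        # token generated in SonarQube
          SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}  # e.g. https://sonarqube.example.com
```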
### SQ with Jenkins

Jenkins offers a SonarQube plugin that allows for easy integration.

Once the plugin is installed, you can configure a Jenkins job to trigger SonarQube scans: provide the SonarQube server configuration and specify the project key and token in the job configuration.
### SQ with Gitea

While Gitea does not have direct plugin support like Jenkins, you can still integrate SonarQube into your Gitea pipelines by using webhook triggers or by configuring a CI/CD tool that integrates with Gitea (such as Drone) to run the SonarQube scans.

* Webhooks: Use webhooks in Gitea to trigger the external CI/CD tool on events like pushes or pull requests.
* CI/CD Tool: In your pipeline configuration (like a `.drone.yml` file for Drone CI), add a step that executes the SonarQube scanner (see the sketch below).
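A minimal Drone pipeline sketch, assuming the scanner CLI image shown here (the project key, secret names, and host are placeholders):

```yml
# .drone.yml (illustrative sketch)
kind: pipeline
type: docker
name: sonarqube-scan

steps:
  - name: code-analysis
    image: sonarsource/sonar-scanner-cli
    environment:
      SONAR_HOST:
        from_secret: sonar_host
      SONAR_TOKEN:
        from_secret: sonar_token
    commands:
      - sonar-scanner -Dsonar.projectKey=my-project -Dsonar.host.url=$SONAR_HOST -Dsonar.login=$SONAR_TOKEN
```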
## Cortex

Open source alternative to AWS SageMaker: production infrastructure for machine learning at scale.

* <https://github.com/cortexlabs/cortex>
* <https://docs.cortexlabs.com/>
## FAQ

### What are microservices?

### What are Web-Hooks?

A webhook is like a doorbell. When a certain event happens in one system (like a new post on a blog or a new commit in a repository), it automatically sends a notification to another system. It is a way for apps to provide other applications with real-time information.

* How It Works: A webhook delivers data to other applications as the event happens, so you get the data immediately. You set up a webhook by giving your URL to the system you want notifications from; when an event occurs, that system makes an HTTP request (usually POST) to the URL you provided. A minimal receiver sketch follows this list.
* Use Case Example: A common use of webhooks is in Continuous Integration/Continuous Deployment (CI/CD) pipelines. For example, GitHub can use a webhook to notify a CI server like Jenkins to start a new build whenever code is pushed to a repository.
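For illustration, a minimal receiver sketch using only the Python standard library (the port and the printed reaction are arbitrary choices, not tied to any particular service):

```python
# Minimal webhook receiver: the source system POSTs JSON to this endpoint.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print("Webhook received:", payload)  # e.g. react to a push event here
        self.send_response(200)              # acknowledge so the sender does not retry
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), WebhookHandler).serve_forever()
```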
### What are API calls?

An API call is like making a phone call to a specific service. You request the information or service you need, and the system responds. It is a way for applications to interact with and request data from each other.

* How It Works: An API call is initiated by the requesting application; it has to make the request to get the data. It is like asking, "Do you have any new data?" The request is usually made via HTTP (GET, POST, PUT, DELETE), and the server processes it and sends back a response. A minimal sketch follows this list.
* Use Case Example: If an application needs the latest weather data, it can make an API call to a weather service. The application sends a request, and the weather service responds with the latest weather information.
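A minimal sketch with the Python standard library (the endpoint URL is a placeholder, not a real weather service):

```python
# Minimal API call: the client asks, the server answers.
import json
from urllib.request import urlopen

with urlopen("https://api.example.com/weather?city=Madrid") as response:
    data = json.loads(response.read())  # parse the JSON body of the reply

print(data)  # e.g. current temperature and humidity returned by the service
```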
### WebHooks vs API Calls

* Initiation:
  * Webhook: Automatically initiated by the source system when an event occurs.
  * API Call: Manually initiated by the requesting system.
* Purpose:
  * Webhook: Used for real-time notifications.
  * API Call: Used for requesting or sending data on demand.
* Direction:
  * Webhook: One-way, from the source to the receiver.
  * API Call: Two-way communication between the requester and the server.
@@ -0,0 +1,145 @@
---
title: Deploying Kubernetes in SBCs - K3s
author: JAlcocerT
date: 2024-01-01 00:10:00 +0800
categories: [IoT & Data Analytics]
tags: [Sensors,Python,MongoDB]
image:
  path: /img/metabase.png
  alt: IoT Project with Python, MongoDB, DHT11/22 sensors and Metabase.
render_with_liquid: false
---

Kubernetes is a tool to manage and automate workflows in the cloud. It orchestrates the infrastructure to accommodate changes in workload.

The developer just needs to define a yml with the desired state of the K8s cluster.

In this project we will be collecting **Temperature and Humidity Data** from a DHT11 or a DHT22 sensor working together with a Raspberry Pi.

The data will be stored in MongoDB, which will live in a Docker container.

Rancher is an open source container management platform built for organizations that deploy containers in production. Rancher makes it easy to run Kubernetes everywhere, meet IT requirements, and empower DevOps teams.
## Rancher: k3s

Setting up a high-availability K3s Kubernetes cluster for Rancher.

We just need to [have Docker installed](https://jalcocert.github.io/RPi/posts/selfhosting-with-docker/) and thanks to Rancher we can **run our own Kubernetes Cluster**.

* <https://hub.docker.com/r/rancher/k3s/tags>
* <https://github.com/rancher/rancher>
* <https://www.rancher.com/community>

### Master Node
```yml
version: '3'
services:
  k3s:
    image: rancher/k3s
    container_name: k3s
    privileged: true
    volumes:
      - k3s-server:/var/lib/rancher/k3s
    ports:
      - "6443:6443"
    restart: unless-stopped

volumes:
  k3s-server:

#docker run -d --name k3s --privileged rancher/k3s
```
## Using kubectl

**kubectl** is a command-line tool that allows you to run commands against Kubernetes clusters.

It is the primary tool for interacting with and managing Kubernetes clusters, providing a versatile way to handle all aspects of cluster operations.

Common kubectl commands (a minimal example manifest follows the list):

* `kubectl get pods`: Lists all pods in the current namespace.
* `kubectl create -f <filename>`: Creates a resource specified in a YAML or JSON file.
* `kubectl apply -f <filename>`: Applies changes to a resource from a file.
* `kubectl delete -f <filename>`: Deletes a resource specified in a file.
* `kubectl describe <resource> <name>`: Shows detailed information about a specific resource.
* `kubectl logs <pod_name>`: Retrieves logs from a specific pod.
* `kubectl exec -it <pod_name> -- /bin/bash`: Executes a command, like opening a bash shell, in a specific container of a pod.
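As an illustration (the names and the nginx image are placeholders), a minimal Deployment manifest that could be applied with `kubectl apply -f deployment.yml`:

```yml
# deployment.yml - minimal example manifest
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-nginx
spec:
  replicas: 2
  selector:
    matchLabels:
      app: hello-nginx
  template:
    metadata:
      labels:
        app: hello-nginx
    spec:
      containers:
        - name: nginx
          image: nginx:alpine
          ports:
            - containerPort: 80

#kubectl apply -f deployment.yml
#kubectl get pods
```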
### Slaves
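A worker (agent) node could be attached with a similar Compose file; in this sketch the server URL and the node token (stored on the k3s server at `/var/lib/rancher/k3s/server/node-token`) are placeholders to fill in:

```yml
version: '3'
services:
  k3s-agent:
    image: rancher/k3s
    container_name: k3s-agent
    command: agent
    privileged: true
    environment:
      - K3S_URL=https://<master-node-ip>:6443  # placeholder: address of the master node
      - K3S_TOKEN=<node-token>                 # placeholder: token copied from the server
    restart: unless-stopped
```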
## FAQ

### What are K8s PODs?

### Master and Nodes with Different CPU archs?

### Rancher Alternatives

*
*
### What is Kubeflow?

* Kubeflow is the machine learning toolkit for Kubernetes:
* <https://www.kubeflow.org/>
* <https://github.com/kubeflow/examples>

Kubeflow is an **open-source platform for machine learning and MLOps on Kubernetes**.

It was introduced by Google in 2017 and has since grown to include many other contributors and projects.

Kubeflow aims to make deployments of machine learning workflows on Kubernetes simple, portable and scalable. It offers services for creating and managing Jupyter notebooks, TensorFlow training, model serving, and pipelines across different frameworks and infrastructures.

* Purpose: Kubeflow is an open-source project designed to make deployments of machine learning (ML) workflows on Kubernetes easier, scalable, and more flexible.
* Scope: It encompasses a broader range of ML lifecycle stages, including preparing data, training models, serving models, and managing workflows.
* Kubernetes-Based: It is specifically built for Kubernetes, leveraging its capabilities for managing complex, distributed systems.
* Components: Kubeflow includes various components like Pipelines, Katib for hyperparameter tuning, KFServing for model serving, and integration with Jupyter notebooks.
* Target Users: It is more suitable for organizations and teams looking to deploy and manage ML workloads at scale in a Kubernetes environment.
### What is MLflow?

* <https://mlflow.org/>
* <https://github.com/mlflow/mlflow>

* Purpose: MLflow is an open-source platform primarily for managing the end-to-end machine learning lifecycle, focusing on tracking experiments, packaging code into reproducible runs, and sharing and deploying models.
* Scope: It is more focused on the experiment tracking, model versioning, and serving aspects of the ML lifecycle.
* Platform-Agnostic: MLflow is designed to work across various environments and platforms. It is not tied to Kubernetes and can run on any system where Python is supported.
* Components: Key components of MLflow include MLflow Tracking, MLflow Projects, MLflow Models, and MLflow Registry.
* Target Users: It is suitable for both individual practitioners and teams, facilitating the tracking and sharing of experiments, models, and workflows.
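To make the tracking component concrete, a minimal sketch (the experiment, parameter, and metric names are made up for illustration):

```python
# Minimal MLflow Tracking sketch; values are illustrative only.
import mlflow

mlflow.set_experiment("sbc-sensor-model")    # experiments group related runs

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)  # hyperparameters of this run
    mlflow.log_metric("rmse", 0.73)          # evaluation results
    # mlflow.log_artifact("model.pkl")       # optionally attach files or models
```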
While they serve different purposes, Kubeflow and MLflow can be used together in a larger ML system.

For instance, you might use MLflow to track experiments and manage model versions, and then deploy these models at scale using Kubeflow on a Kubernetes cluster.

Such integration would leverage the strengths of both platforms: MLflow for experiment tracking and Kubeflow for scalable, Kubernetes-based deployment and management of ML workflows.

In summary, while Kubeflow and MLflow are not directly related and serve different aspects of the ML workflow, they can be complementary in a comprehensive ML operations (MLOps) strategy.
### Kustomize

* What It Is: Kustomize is a standalone tool to customize Kubernetes objects through a declarative configuration file. It has also been part of kubectl since v1.14.
* Usage in DevOps/MLOps:
  * Configuration Management: Manage Kubernetes resource configurations without templating.
  * Environment-Specific Adjustments: Customize applications for different environments without altering the base resource definitions.
  * Overlay Approach: Overlay different configurations (e.g., patches) over a base configuration, allowing for reusability and simplicity (see the sketch below).
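A minimal overlay sketch, assuming a `base/` directory holding the shared manifests and a production overlay that only patches the replica count (all paths and file names are illustrative):

```yml
# overlays/prod/kustomization.yaml (illustrative layout)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base            # shared, environment-agnostic manifests
patchesStrategicMerge:
  - replicas-patch.yml    # e.g. raises replicas for production only

#kubectl apply -k overlays/prod
```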
### Useful Videos to Learn more about K8s

* <https://www.youtube.com/watch?v=PziYflu8cB8>
* <https://www.youtube.com/watch?v=s_o8dwzRlu4>
* <https://www.youtube.com/watch?v=DCoBcpOA7W4>
* <https://www.youtube.com/watch?v=n-fAf2mte6M>