Merge branch 'main' of https://github.com/JAlcocerT/RPi into main
JAlcocerT committed Mar 2, 2024
2 parents a1959b6 + bce7d1e commit 330bc57
Showing 5 changed files with 441 additions and 0 deletions.
180 changes: 180 additions & 0 deletions _posts/2023-12-11-rpi-gps-superset.md
@@ -0,0 +1,180 @@
---
title: RPi IoT Project - GPS Data (VK-162) with Apache Superset
author: JAlcocerT
date: 2023-12-11 00:10:00 +0800
categories: [IoT & Data Analytics]
tags: [Sensors,Python,MongoDB]
image:
  path: /img/superset.png
  alt: IoT Project with a Raspberry Pi, GPS data (VK-162) and Apache Superset.
render_with_liquid: false
---

* <https://www.youtube.com/watch?v=Z7cJ59sixpk&t=197s>
* <https://www.youtube.com/watch?v=3ysOqliO6F8>

## ToDo list

- [ ] Job Done!
  + [ ] Setup BI - Superset
  + [ ] Hardware Checks
  + [ ] Connecting everything

<https://www.youtube.com/watch?v=Z7cJ59sixpk>

## Apache Superset Setup

Apache Superset is a [Free BI Web Tool](https://superset.apache.org/docs/intro/) that we can [use with our RPi projects locally](https://superset.apache.org/docs/installation/installing-superset-using-docker-compose/).


```sh
# Get the official Superset repository
git clone https://github.com/apache/superset.git
cd superset

# Spin up Superset (non-dev compose file) in the background
docker compose -f docker-compose-non-dev.yml up -d

# Optionally, pin a specific release instead of the latest image:
#git checkout 3.0.0
#TAG=3.0.0 docker compose -f docker-compose-non-dev.yml up
```

Then, just use Superset with its UI at: **http://localhost:8088/login/**

![Desktop View](/img/superset-working.png){: width="972" height="589" }
_Apache Superset UI running locally_

*Default credentials are: admin/admin*

- [ ] Job Done!
  + [x] Setup BI - Superset
  + [ ] Hardware Checks
  + [ ] Connecting everything


## Sensors

* VK-162
* Columbus V-800 + [gpsd](https://gpsd.io/) client
* BY-353 USB GPS


## FAQ

### Apache Superset with Portainer

This is the stack definition, in case you want to deploy Superset through Portainer:

```yml
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset:${TAG:-latest-dev}
x-superset-depends-on: &superset-depends-on
  - db
  - redis
x-superset-volumes:
  &superset-volumes # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
  - ./docker:/app/docker
  - superset_home:/app/superset_home

version: "3.7"
services:
  redis:
    image: redis:7
    container_name: superset_cache
    restart: unless-stopped
    volumes:
      - redis:/data

  db:
    env_file: docker/.env-non-dev
    image: postgres:14
    container_name: superset_db
    restart: unless-stopped
    volumes:
      - db_home:/var/lib/postgresql/data
      - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d

  superset:
    env_file: docker/.env-non-dev
    image: *superset-image
    container_name: superset_app
    command: ["/app/docker/docker-bootstrap.sh", "app-gunicorn"]
    user: "root"
    restart: unless-stopped
    ports:
      - 8088:8088
    depends_on: *superset-depends-on
    volumes: *superset-volumes

  superset-init:
    image: *superset-image
    container_name: superset_init
    command: ["/app/docker/docker-init.sh"]
    env_file: docker/.env-non-dev
    depends_on: *superset-depends-on
    user: "root"
    volumes: *superset-volumes
    healthcheck:
      disable: true

  superset-worker:
    image: *superset-image
    container_name: superset_worker
    command: ["/app/docker/docker-bootstrap.sh", "worker"]
    env_file: docker/.env-non-dev
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: "root"
    volumes: *superset-volumes
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "celery -A superset.tasks.celery_app:app inspect ping -d celery@$$HOSTNAME",
        ]

  superset-worker-beat:
    image: *superset-image
    container_name: superset_worker_beat
    command: ["/app/docker/docker-bootstrap.sh", "beat"]
    env_file: docker/.env-non-dev
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: "root"
    volumes: *superset-volumes
    healthcheck:
      disable: true

volumes:
  superset_home:
    external: false
  db_home:
    external: false
  redis:
    external: false
```

### Apache Superset Data Sources and API

* Data Sources: <https://superset.apache.org/docs/databases/db-connection-ui>
* API info: <https://superset.apache.org/docs/api>

### PhyPhox

* You can also save GPS data thanks to the [F/OSS PhyPhox](https://github.com/phyphox/phyphox-android) app, which allows us to use a phone's sensors for physics experiments:
  * Also available for [ESP32 with MicroPython](https://github.com/phyphox/phyphox-micropython)
  * And [also for Arduino](https://github.com/phyphox/phyphox-arduino)
83 changes: 83 additions & 0 deletions _posts/2024-01-01-MLOps.md
@@ -0,0 +1,83 @@
---
title: Machine Learning Ops with SBCs
author: JAlcocerT
date: 2024-01-01 00:10:00 +0800
categories: [IoT & Data Analytics]
tags: [Sensors,Python,MongoDB]
image:
  path: /img/metabase.png
  alt: IoT Project with Python, MongoDB, DHT11/22 sensors and Metabase.
render_with_liquid: false
---




## Gitea

* <https://fossengineer.com/selfhosting-Gitea-docker/>
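
For reference, a minimal docker-compose sketch for self-hosting Gitea; the image, ports and volume name below are the usual defaults, but the guide above covers the full setup:

```yml
version: "3"
services:
  gitea:
    image: gitea/gitea:latest
    container_name: gitea
    restart: unless-stopped
    ports:
      - "3000:3000"   # web UI
      - "222:22"      # SSH for git operations
    volumes:
      - gitea_data:/data   # Gitea configuration and repositories

volumes:
  gitea_data:
```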

## Jenkins

* <https://fossengineer.com/selfhosting-jenkins-ci-cd/>
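
Similarly, a minimal compose sketch for Jenkins (the standard `jenkins/jenkins:lts` image and default ports are assumed; see the guide above for the full walkthrough):

```yml
version: "3"
services:
  jenkins:
    image: jenkins/jenkins:lts
    container_name: jenkins
    restart: unless-stopped
    ports:
      - "8080:8080"     # web UI
      - "50000:50000"   # inbound build agents
    volumes:
      - jenkins_home:/var/jenkins_home   # Jenkins configuration and jobs

volumes:
  jenkins_home:
```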

## SonarQube

GitHub Actions can be used to trigger SonarQube scans on events like pushes or pull requests: add a step to your GitHub Actions workflow that provides the SonarQube server details and runs the scanner as part of the pipeline.
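
A minimal workflow sketch, assuming the official `SonarSource/sonarqube-scan-action` and repository secrets named `SONAR_TOKEN` and `SONAR_HOST_URL` (the secret names are just a convention here):

```yml
# .github/workflows/sonarqube.yml - hypothetical example workflow
name: SonarQube Scan
on:
  push:
    branches: [main]
  pull_request:

jobs:
  sonarqube:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # full history helps SonarQube analyze new code
      - uses: SonarSource/sonarqube-scan-action@master
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}        # token generated in SonarQube
          SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}  # e.g. http://your-sonarqube:9000
```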

### SQ with Jenkins

Jenkins offers a SonarQube plugin that allows for easy integration.

Once the plugin is installed, you can configure a Jenkins job to trigger SonarQube scans. This can include providing the SonarQube server configuration and specifying the project key and token in the job configuration.

### SQ with Gitea

While Gitea does not have direct plugin support like Jenkins, you can still integrate SonarQube into your Gitea pipelines using webhook triggers or by manually configuring CI/CD tools (like Drone, which integrates with Gitea) to run SonarQube scans.

* Webhooks: Use webhooks in Gitea to trigger external CI/CD tools on events like push or pull requests.
* CI/CD Tool: In your CI/CD pipeline configuration (like a `.drone.yml` file for Drone CI), add steps to execute the SonarQube scanner, as in the sketch below.
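
A minimal `.drone.yml` example of that step; the SonarQube URL, project key and the `sonar_token` secret name are placeholders, and the public `sonarsource/sonar-scanner-cli` image is assumed:

```yml
# .drone.yml - hypothetical pipeline, triggered by Gitea webhooks via Drone
kind: pipeline
type: docker
name: sonarqube-scan

steps:
  - name: sonar-scan
    image: sonarsource/sonar-scanner-cli
    environment:
      SONAR_HOST_URL: http://your-sonarqube:9000   # placeholder SonarQube server URL
      SONAR_TOKEN:
        from_secret: sonar_token                   # token stored as a Drone secret
    commands:
      - sonar-scanner -Dsonar.projectKey=my-project -Dsonar.sources=.

trigger:
  event:
    - push
    - pull_request
```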


## Cortex

Cortex is an open-source alternative to AWS SageMaker: production infrastructure for deploying machine learning models at scale.


* <https://github.com/cortexlabs/cortex>
* <https://docs.cortexlabs.com/>




## FAQ

### What are microservices?

Microservices are an architectural style where an application is split into small, independently deployable services that communicate over the network (typically via APIs or webhooks) instead of running as a single monolith.

### What are Web-Hooks?

A webhook is like a doorbell. When certain events happen in one system (like a new post on a blog or a new commit in a repository), it automatically sends a notification to another system. It's a way for apps to provide other applications with real-time information.

* How It Works: A webhook delivers data to other applications as it happens, meaning you get data immediately. You set up a webhook by providing a URL to the system you want to receive the notifications. When an event occurs, the system makes an HTTP request (usually POST) to the URL you provided.
* Use Case Example: A common use of webhooks is in Continuous Integration/Continuous Deployment (CI/CD) pipelines. For example, GitHub can use a webhook to notify a CI server like Jenkins to start a new build whenever code is pushed to a repository.

### What are API calls?

An API call is like making a phone call to a specific service. You request the information or service you need, and the system responds back. It's a way for applications to interact and request data from each other.

* How It Works: An API call is a manual process; you have to make the request to get the data. It’s like asking, "Do you have any new data?" The request is usually made via HTTP (GET, POST, PUT, DELETE), and the server processes the request and sends back a response.
* Use Case Example: If you have an application that needs to get the latest weather data, it can make an API call to a weather service. The application sends a request, and the weather service responds with the latest weather information.

### WebHooks vs API Calls

* Initiation:
  * Webhook: Automatically initiated by the source system when an event occurs.
  * API Call: Manually initiated by the requesting system.
* Purpose:
  * Webhook: Used for real-time notifications.
  * API Call: Used for requesting or sending data on demand.
* Direction:
  * Webhook: One-way, from the source to the receiver.
  * API Call: Two-way communication between the requester and the server.
