Release garbo to production (#495)
* Improve the first-time local development experience for new contributors (#489)

* Allow starting only the API to simplify local frontend development

* Clarify documentation for how to start only the API, restore a local DB backup, and test migrations

* Ensure the database is created when restoring

* Fix #493 - Allow setting individual `scope2` values to `null` to get correct calculations of total emissions (#494)
Greenheart authored Dec 16, 2024
2 parents a703539 + c3518b0 commit e80b14e
Showing 5 changed files with 119 additions and 37 deletions.
90 changes: 72 additions & 18 deletions README.md
@@ -80,75 +80,129 @@ This project expects some containers running in the background to work properly.
The simplest way to start the containers the first time is to run the following docker commands.

```bash
docker run -d -p 6379:6379 --name garbo_redis redis
docker run -d -p 5432:5432 --name garbo_postgres -e POSTGRES_PASSWORD=mysecretpassword postgres

# These are only necessary to develop the AI pipeline. Feel free to skip them if you only plan to develop the frontend and/or the API.
docker run -d -p 6379:6379 --name garbo_redis redis
docker run -d -p 8000:8000 --name garbo_chroma chromadb/chroma
docker run -d -p 5001:5001 --name garbo_ingestor ghcr.io/nlmatics/nlm-ingestor
```
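If you want to check which of these containers are currently running, `docker ps` with a name filter works:

```sh
docker ps --filter "name=garbo_"
```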

Next time, you can start the containers back up using

```sh
docker start garbo_redis garbo_postgres garbo_chroma garbo_ingestor
docker start garbo_postgres garbo_redis garbo_chroma garbo_ingestor
```

Or if you only plan to develop the frontend and/or the API, this is enough:

```sh
docker start garbo_postgres
```

You may want a graphical user interface to make it easier to manage your local containers. [Podman desktop](https://podman-desktop.io/) and [Rancher desktop](https://rancherdesktop.io/) are both good alternatives.

### Seeding the database for development

This applies migrations and seeding data needed for development
This applies migrations and seeds the data needed for development.

```sh
npm run prisma migrate dev
```
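If you later want to re-run only the seeding step (for example after clearing data), the seed command referenced further down in this README can also be run on its own:

```sh
npm run prisma db seed # seed the data with initial content
```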

### Optional: Restoring a database backup with test data

> [!NOTE]
> This step is very helpful to get a good starting point for developing and testing the frontend and/or the API. However, you may also skip it if you want to start with a clean database.

First, ask one of the Klimatkollen team members and they will send you a database backup.

Not required the first time: delete the database to make sure it doesn't already exist:

```sh
docker exec -i garbo_postgres dropdb -f -U postgres --if-exists garbo
```

Then, replace `~/Downloads/backup_garbo_XYZ.dump` with the path to your DB backup file and restore the database backup with the following command:

```sh
docker exec -i garbo_postgres pg_restore -C -v -d postgres -U postgres < ~/Downloads/backup_garbo_XYZ.dump
```
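To confirm the restore worked, you can list the tables in the restored database (assuming the backup creates a database named `garbo`, as the `dropdb` command above implies):

```sh
docker exec -i garbo_postgres psql -U postgres -d garbo -c '\dt'
```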

### Starting the Garbo project in development mode

The code consists of two different starting points. You can start both the BullMQ queue UI, the API and the workers concurrently using:
The code can be started in three main ways, depending on what you plan to develop/test/run locally.

#### 1) To serve only the API:

> [!NOTE]
> If you plan to develop the frontend and/or the API, this is the best way to get started:

```bash
npm run dev
npm run dev-api
```

This command will start both the dev-board and dev-workers concurrently. Now you can go to <http://localhost:3000> and see the dashboard.
#### 2) To start the AI pipeline, BullMQ admin dashboard and the API:

If you want to run them separately, use the following commands:
If you plan to develop the AI pipeline, this is the recommended way to start the code.

To serve the BullMQ queue UI and the API:
First, run the following command to start the API and the queue system, including an admin dashboard to view progress, logs and more.

```bash
npm run dev-board
```

To start the workers responsible for doing the actual work, which can be scaled horizontally:
Now you can go to <http://localhost:3000> and see the dashboard.

Then, open another terminal and start the AI pipeline and its workers, which are responsible for processing each report. These can be scaled horizontally.

```bash
npm run dev-workers
```

### Restoring a DB backup locally
#### 3) Starting everything concurrently

Get everything up and running with one command (with all output in one terminal).

```bash
npm run dev
```

### Setup completed 🎉

Well done! You've now set up the `garbo` backend and are ready to start development :)

---

### Testing DB migrations

These steps can be useful to test DB migrations, or develop with data that is similar to the that in the production environment
These steps can be useful to test DB migrations with data similar to the production environment.

1. Optional: Create a local test DB.
1. Recommended: Create a local test DB. This allows you to more easily test migrations without affecting your regular development database.

```sh
docker run -d -p 5432:5432 --name garbo_test_postgres -e POSTGRES_PASSWORD=mysecretpassword postgres
```

Alternatively, make sure your local postgres container is running.

2. Download the DB dump file. Ask someone from Klimatkollen to get one.
2. Ask one of the Klimatkollen team members and they will send you a database backup.

3. Delete the database if it exists:

```sh
docker exec -i garbo_test_postgres dropdb -f -U postgres --if-exists garbo
```

3. Restore the backup. This will initially connect to the default `postgres` database without making any modifications and then create any databases if they do not exist
4. Restore the backup. This will initially connect to the default `postgres` database without making any modifications, and then create any databases that do not already exist.

```sh
docker exec -i container_name pg_restore -U postgres -C -v -d postgres < ~/Downloads/backup_garbo_XYZ.dump
docker exec -i garbo_test_postgres pg_restore -C -v -d postgres -U postgres < ~/Downloads/backup_garbo_XYZ.dump
```

4. Apply DB migrations with `npm run prisma migrate dev`.
5. Test the DB migrations with `npm run prisma migrate dev`.

5. Restart the Garbo API and workers.
6. Restart the Garbo API and workers and verify the migration was successful.
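As a quick sanity check (assuming the API runs on its default port 3000, as noted elsewhere in this README), request the companies endpoint and confirm it responds:

```sh
curl http://localhost:3000/api/companies
```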

### Testing

@@ -177,7 +231,7 @@ npm run prisma db seed # seed the data with initial content

### Operations / DevOps

This application is deployed in production with Kubernetes and uses FluxCD as CD pipeline. The yaml files in the k8s is automatically synced to the cluster. If you want to run a fork of the application yourself - just add these helm charts as dependencies:
This application is deployed in production with Kubernetes and uses FluxCD as the CD pipeline. The YAML files in the k8s directory are automatically synced to the cluster. If you want to run a fork of the application yourself, just add these Helm charts as dependencies:

```helm
postgresql (bitnami)
...
```
1 change: 1 addition & 0 deletions package.json
@@ -12,6 +12,7 @@
"workers": "node --import tsx src/startWorkers.ts",
"dev-workers": "node --import tsx --watch src/startWorkers.ts",
"dev-board": "node --import tsx --watch src/index.ts",
"dev-api": "node --import tsx --watch src/index.ts --api-only",
"dev": "concurrently \"npm run dev-board\" \"npm run dev-workers\"",
"import": "node --import=tsx scripts/import-spreadsheet-companies.ts",
"test": "jest",
47 changes: 32 additions & 15 deletions src/index.ts
@@ -1,27 +1,44 @@
import express from 'express'
import { parseArgs } from 'node:util'

import queue from './queue'
import discord from './discord'
import api from './api'
import apiConfig from './config/api'

const { values } = parseArgs({
  options: {
    'api-only': {
      type: 'boolean',
      default: false,
    },
  },
})

const START_BOARD = !values['api-only']

const port = apiConfig.port
const app = express()

app.get('/favicon.ico', express.static('public/favicon.png'))
app.use('/api', api)
app.use('/admin/queues', queue)

app.get('/', (req, res) => {
  res.send(
    `Hi I'm Garbo!
    Queues: <br>
    <a href="/admin/queues">/admin/queues</a>`
  )
})

app.listen(port, () => {
  console.log(`Running on ${port}...`)
  console.log(`For the UI, open http://localhost:${port}/admin/queues`)
  discord.login()
if (START_BOARD) {
  const queue = (await import('./queue')).default
  app.use('/admin/queues', queue)
  app.get('/', (req, res) => {
    res.send(
      `Hi I'm Garbo!
      Queues: <br>
      <a href="/admin/queues">/admin/queues</a>`
    )
  })
}

app.listen(port, async () => {
  console.log(`API running at http://localhost:${port}/api/companies`)

  if (START_BOARD) {
    const discord = (await import('./discord')).default
    console.log(`For the UI, open http://localhost:${port}/admin/queues`)
    discord.login()
  }
})
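For reference, the new `--api-only` flag is what the `dev-api` script added in `package.json` passes, so these two commands are equivalent ways to start only the API:

```sh
npm run dev-api

# equivalent direct invocation
node --import tsx --watch src/index.ts --api-only
```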
6 changes: 5 additions & 1 deletion src/lib/prisma.ts
@@ -69,7 +69,11 @@ export async function upsertScope1(

export async function upsertScope2(
  emissions: Emissions,
  scope2: OptionalNullable<Omit<Scope2, 'id' | 'metadataId' | 'unit'>> | null,
  scope2: {
    lb?: number | null
    mb?: number | null
    unknown?: number | null
  } | null,
  metadata: Metadata
) {
  if (scope2 === null) {
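A hypothetical call against the new `upsertScope2` signature, as a sketch only (the `emissions` and `metadata` values are assumed to come from the surrounding upsert flow; this is not code from the repo):

```ts
// Update the market-based value and explicitly clear the location-based one.
// `unknown` is omitted here, so it is presumably left untouched.
await upsertScope2(emissions, { mb: 1234.5, lb: null }, metadata)
```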
12 changes: 9 additions & 3 deletions src/routes/updateCompanies.ts
@@ -329,13 +329,19 @@ export const emissionsSchema = z
  .object({
    mb: z
      .number({ description: 'Market-based scope 2 emissions' })
      .optional(),
      .optional()
      .nullable()
      .describe('Sending null means deleting mb scope 2 emissions'),
    lb: z
      .number({ description: 'Location-based scope 2 emissions' })
      .optional(),
      .optional()
      .nullable()
      .describe('Sending null means deleting lb scope 2 emissions'),
    unknown: z
      .number({ description: 'Unspecified Scope 2 emissions' })
      .optional(),
      .optional()
      .nullable()
      .describe('Sending null means deleting unknown scope 2 emissions'),
  })
  .refine(
    ({ mb, lb, unknown }) =>
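With these schema changes, omitting a scope 2 field and sending `null` mean different things. An illustrative payload fragment (the endpoint and the surrounding request shape are not part of this diff):

```ts
// Fragment accepted by the updated scope 2 part of emissionsSchema:
// a number updates the value, null deletes it, an omitted key is left unchanged.
const scope2 = {
  mb: 1500, // set market-based emissions
  lb: null, // delete the stored location-based value
  // `unknown` omitted: not modified
}
```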
