# Database maintenance

This page guides you through maintaining your local database. When the DB schema changes, the backend may throw an error and fail to start.
## Restarting the docker way

- Change to the backend directory:

  ```
  cd backend
  ```

- Bring down the service:

  ```
  docker-compose down
  ```

- Wipe the DB:

  ```
  docker system prune && docker image prune && docker volume prune
  ```

- Restart the service:

  ```
  docker-compose up --build
  ```
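The steps above can be condensed into a single shell session (a sketch; it assumes you are starting from the repository root, and note that the prune commands affect unused Docker data machine-wide, not just this project):

```shell
cd backend

# Stop the running service.
docker-compose down

# Wipe unused containers, images, and volumes (system-wide, so make sure
# you have nothing from other projects you want to keep).
docker system prune && docker image prune && docker volume prune

# Rebuild and restart.
docker-compose up --build
```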
## Restarting the SQL way

- In the backend directory, run

  ```
  db-setup/nuke-db.sh
  ```

- We need to delete one of the DB view roles so that the migration written to add it can recreate it. Delete the `simple_report_no_phi` database role:

  - Run

    ```
    docker exec -it db psql -U postgres -a simple_report
    ```

    to get into psql within your docker container.

  - Run

    ```
    DROP ROLE IF EXISTS simple_report_no_phi;
    ```

  - Run `quit` to get out of psql.

- Restart the Spring app with

  ```
  gradle bootRun --args='--spring.profiles.active=dev'
  ```

  or via the provided backend run configs.

- If you regularly run the Okta-enabled backend, you may want to run the one without Okta first so that you reload the default dummy orgs into your local db.
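If you prefer to skip the interactive psql session, the role can likely be dropped in one shot (a sketch, assuming the Postgres container is named `db` as in the steps above):

```shell
# Drop the view role non-interactively inside the db container.
docker exec -i db psql -U postgres -d simple_report \
  -c "DROP ROLE IF EXISTS simple_report_no_phi;"
```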
## Rollbacks

The application uses the Liquibase plugin for Gradle to perform specific database management tasks.
To roll the database back to its state at a prior date:

```
docker compose run --rm backend gradle liquibaseRollbackToDate -PliquibaseCommandValue=${date}
```

or

```
./gradlew liquibaseRollbackToDate -PliquibaseCommandValue=${date}
```

To roll back a certain number of migrations:

```
docker compose run --rm backend gradle liquibaseRollbackCount -PliquibaseCommandValue=${n}
```

or

```
./gradlew liquibaseRollbackCount -PliquibaseCommandValue=${n}
```
To roll back to a specific tag:

```
docker compose run --rm backend gradle liquibaseRollback -PliquibaseCommandValue=${TAG}
```

or

```
./gradlew liquibaseRollback -PliquibaseCommandValue=${TAG}
```
If you need to roll back a non-local database, you can generate the necessary SQL to execute elsewhere. Use `liquibaseRollbackToDateSQL` or `liquibaseRollbackCountSQL` in the manner described above to write the rollback SQL to stdout.
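For example (a sketch; the count value and filename here are arbitrary), you could capture the generated SQL to a file for review before executing it against the remote database:

```shell
# Generate (but do not execute) the SQL for rolling back the last 3 migrations,
# redirecting stdout to a file; "3" is just an example value.
./gradlew liquibaseRollbackCountSQL -PliquibaseCommandValue=3 > rollback.sql

# Inspect the generated statements before running them elsewhere.
less rollback.sql
```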
Our changelog contains some migrations that make use of `get_census_dob_group(...)`. These migrations cannot be rolled back as written, because they need to specify the schema and would have to be rewritten to `${database.defaultSchemaName}.get_census_dob_group(...)`. Rewriting the changelog would require us to manually recompute the checksum inside the `databasechangelog` table for the affected migrations in every environment, and would require some downtime. To get around this issue, we decided to test only the last 20 rollbacks; that way, we don't hit the problematic migrations. We don't see a case where we will ever roll back to the problematic migrations, so this compromise is acceptable.
## Using the database tool in IntelliJ

Using the database tool in IntelliJ, you can view and modify entries in your database. The database tool window can be opened by clicking `Database` on the right panel of IntelliJ or by going to `View > Tool Windows > Database`.

To add your database, go to `+ > Data Source > PostgreSQL`. Set the JDBC URL to `jdbc:postgresql://localhost:5432/simple_report`, the user to `simple_report_app`, and the password to `api123`. In the Schemas tab, ensure the correct schema is selected; typically you can just set the Schema Pattern to `simple_report:simple_report`. The connection can now be tested; apply the configuration if it's valid.

To access the data, go to `simple_report@localhost > simple_report > tables` and double-click the table you wish to view.

To access data via query, open the query console and run any query. For example, to get all test events you can use `select * from simple_report.test_event;`
To view tables/entries in your local database, team members currently use Postico and DataGrip. We recommend using something similar to view and work with your local data. To connect to the Postgres server, use the following credentials:

- username: `simple_report_app`
- password: `api123`
- port: `5432`
- database: `simple_report`
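With the credentials above, you can also query the local database directly from the command line (a sketch, assuming the Postgres container is named `db` as in the restart steps above):

```shell
# Open an interactive psql session as the application user.
docker exec -it db psql -U simple_report_app -d simple_report

# Or run a one-off query non-interactively.
docker exec -i db psql -U simple_report_app -d simple_report \
  -c "select count(*) from simple_report.test_event;"
```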
## Restoring an Azure database from a backup

- Click into the database that needs to be restored
- Click into the "Backup and restore" blade
- Select the backup that you want to restore to (check the dates; the highest number is the latest)
- Click the "fast restore" link
- Set a name for the database you want to restore
- Click "Review + Create", or update DB settings if you know what you're doing
- Click "Create"
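If you prefer the CLI, a point-in-time restore can likely be done with the Azure CLI as well (a sketch only; the resource group, server names, and timestamp are placeholders, and the exact command depends on the server type — check `az postgres flexible-server restore --help`):

```shell
# Restore a new server from the source server's automatic backups at a given
# point in time. All names and the timestamp below are placeholders.
az postgres flexible-server restore \
  --resource-group my-resource-group \
  --name simple-report-db-restored \
  --source-server simple-report-db \
  --restore-time "2024-01-01T00:00:00Z"
```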
## Upgrading the database server

**Please be sure to check Terraform plans before applying them to any environment you've upgraded. Depending on what you've done, you may need either to update the Terraform configuration or to remove/import state to get things lined back up.**

- If this is planned, we suggest you wait for the latest backup to prod. Azure sets the daily backup time internally; this is not something we control.
- Create a DB instance from a DB server backup. Name it after the environment you're working in, e.g. `test-db-backup`.
- Wait until the server is running and Azure has created a backup from the `test-db-backup` instance before moving on.
- Upgrade the database server, either by using `pg_dump`/`pg_restore` or by using the fancy Azure portal button.
- If you're sure the upgrade went well, delete the backup instance `test-db-backup`.
- If something with the upgrade went terribly wrong and you want your old version back:
  - Delete the DB instance you just upgraded (this needs to happen because of name collisions; we want to keep all our original settings)
  - Restore a new instance with the original name from the `test-db-backup` backup instance you created earlier
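The `pg_dump`/`pg_restore` route might look roughly like this (a sketch only; the hostnames and the dump filename are placeholders, and the real Azure server names, users, and SSL settings will differ):

```shell
# Dump the old server's database in custom format (placeholder hostname/user).
pg_dump -Fc -h old-server.postgres.database.azure.com -U simple_report_app \
  -d simple_report -f simple_report.dump

# Restore the dump into the new (upgraded) server.
pg_restore -h new-server.postgres.database.azure.com -U simple_report_app \
  -d simple_report --no-owner simple_report.dump
```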