
Migrating from other postgres container? #51

Closed
ministryofsillywalks opened this issue Jan 7, 2025 · 3 comments

@ministryofsillywalks

Problem/Motivation

I am running a Postgres container on an Unraid server which is going to be disabled soon.
I would like to move my data (mainly my long-term stats) to the HAOS add-on.

Expected behavior

I created a backup of my old database, which apparently didn't purge correctly and is now about 10 GB, with detailed data going back several months instead of just 2 weeks.
Normally pgAdmin allows you to upload backups to restore, but only up to a few MB.
I can't find the folder where this add-on stores its data, so that I could transfer the backup file there directly for restoring.
I would also love to clean up the backup, but I'm unsure how to do this without losing my long-term statistics.
I know this isn't actually an issue, but since discussions aren't enabled I didn't know where else to post.
All help is very much appreciated! Thanks a ton!

Actual behavior

Old backup too large to upload via pgAdmin.

Steps to reproduce

Proposed changes

@expaso
Owner

expaso commented Jan 7, 2025

Hi @ministryofsillywalks !!

First of all, thank you for using this addon!! 🙏🏻

No problem getting your data in (and out).
The addon by default mounts the /share (and /media and /backup) folders from the host into the addon.

So, if you simply place your 10 GB backup file in the /share folder of your Home Assistant installation, you can pick it up from there.

To get the file into /share in the first place, use an addon like Samba Share.

[screenshot]

This will expose these folders as shares, so you can place your 10 GB backup there:

[screenshot]

From here, all these files are also visible from the timescaledb addon if you navigate to the /share directory:

[screenshot]
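To make that concrete: once the dump is visible under /share, restoring it is just an ordinary pg_restore / psql run against the addon's database. A rough sketch (database name, user and file names below are only examples, adjust them to your setup):

```bash
# Restore a custom-format dump (created with pg_dump -Fc) into the addon's database.
# Database name, user and file name are examples only.
pg_restore -h localhost -U postgres -d homeassistant --no-owner /share/ha_backup.dump

# If the backup is a plain SQL dump instead:
psql -h localhost -U postgres -d homeassistant -f /share/ha_backup.sql
```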

This also works the other way around: you can place backups there and pick them up. I do this with a pgAgent job:

[screenshot]
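The job itself is essentially just a pg_dump that writes into /share. A minimal sketch of such a backup step (paths and names are examples):

```bash
# Dump the database to a dated, custom-format file on the shared folder.
pg_dump -h localhost -U postgres -Fc -f /share/backups/homeassistant_$(date +%F).dump homeassistant
```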

Oh, and as a bonus: did you notice the TestShare folder?

[screenshot]

This folder comes from 'Network Storage' and it's actually mounted on my Synology NAS:

[screenshot]

I use this to write backups created by pgAgent directly to my NAS, without first storing them on my Home Assistant machine.

Have a great time using the addon! 🎉

@ministryofsillywalks
Author

Thanks a ton for this detailed answer! Really appreciate it.
How could I go about cleaning up my old database before importing it? I really don't want to import 10 GB of data going back 6 months which I don't even need...
I really just want the long-term stats; everything else I don't really care about.
Again, thanks a LOT!

@expaso
Owner

expaso commented Jan 8, 2025

You're welcome!

I think you first need to restore the whole thing, and then delete the data you don't want.

The tables events and states are the ones you want to purge your old data from.
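For example, something along these lines, run with psql against the restored database. The 14-day cutoff and the column names are assumptions: recent recorder schemas use epoch-second columns like time_fired_ts / last_updated_ts, older ones use time_fired / last_updated, so check your schema first:

```bash
psql -h localhost -U postgres -d homeassistant <<'SQL'
-- states rows reference each other through old_state_id, so detach the rows we are about to delete
UPDATE states SET old_state_id = NULL
 WHERE old_state_id IN (
   SELECT state_id FROM states
    WHERE last_updated_ts < EXTRACT(EPOCH FROM now() - INTERVAL '14 days'));

DELETE FROM states
 WHERE last_updated_ts < EXTRACT(EPOCH FROM now() - INTERVAL '14 days');

DELETE FROM events
 WHERE time_fired_ts < EXTRACT(EPOCH FROM now() - INTERVAL '14 days');

-- Long-term statistics live in the statistics / statistics_short_term tables, so they stay untouched.
VACUUM FULL states;
VACUUM FULL events;
SQL
```

Alternatively, after restoring you can let Home Assistant do the cleanup itself by calling the recorder.purge service with a suitable keep_days.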

@expaso expaso closed this as completed Jan 8, 2025