
ERROR opening database connection for Delete Thread #851

Open
twitham1 opened this issue Feb 1, 2024 · 2 comments
twitham1 commented Feb 1, 2024

  • Platform: Ubuntu 20.04.6 LTS

  • MythTV version: fixes/33

  • Package version: v33.1+fixes.202401260819.512d723c83~ubuntu20.04.1

  • Component: backend

What steps will reproduce the bug?

I ran contrib/maintenance/flush_deleted_recgroup.pl while the Deleted group contained 235 recordings. This caused the backend to begin deleting one file every 4 seconds. After almost 100 successful deletes in about 6.5 minutes, the backend began complaining and failing to delete the rest of the recordings.

How often does it reproduce? Is there a required condition?

Possibly it ran up against this configured connection limit:

# cat /etc/mysql/mysql.conf.d/mythtv.cnf
[mysqld]
bind-address=::
max_connections=100

If so, we could reduce max_connections to reproduce this with fewer deletes.

What is the expected behaviour?

I expect serial deletes to use no more than one database connection at a time, and all deletes to succeed.
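For illustration only, the expected pattern can be sketched in Python (this is a toy model with a stand-in FakeConnection class, not MythTV code): serial deletes share a single open connection, so the peak connection count never exceeds one.

```python
class FakeConnection:
    """Stand-in for a database connection; tracks how many are open."""
    open_count = 0
    peak = 0

    def __init__(self):
        FakeConnection.open_count += 1
        FakeConnection.peak = max(FakeConnection.peak, FakeConnection.open_count)

    def close(self):
        FakeConnection.open_count -= 1


def delete_serially(recording_ids):
    """Expected behaviour: one connection is re-used for every delete."""
    conn = FakeConnection()
    try:
        for rec_id in recording_ids:
            pass  # issue the DELETE for rec_id over `conn`
    finally:
        conn.close()
    return FakeConnection.peak


print(delete_serially(range(235)))  # peak stays at 1 for all 235 deletes
```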

What do you see instead?

Here is the last success and the first 2 fails which repeat for all pending deletes:

Jan 30 13:07:40 mythtv mythbackend: mythbackend[2980]: N DeleteThread mainserver.cpp:2538 (DeleteRecordedFiles) DeleteRecordedFiles - recording id 5930 filename /var/lib/mythtv1/recordings/1181_20200415173000.ts
Jan 30 13:07:40 mythtv mythbackend: mythbackend[2980]: N DeleteThread mainserver.cpp:2611 (DoDeleteInDB) DoDeleteINDB - recording id 5930 (chanid 1181 at 2020-04-15T17:30:00Z)
Jan 30 13:07:44 mythtv mythbackend: mythbackend[2980]: E DeleteThread mythdbcon.cpp:238 (OpenDatabase) [DBManager611] Unable to connect to database!
Jan 30 13:07:44 mythtv mythbackend: mythbackend[2980]: E DeleteThread mythdbcon.cpp:239 (OpenDatabase) Driver error was [1/1040]:#012QMYSQL: Unable to connect#012Database error was:#012Too many connections
Jan 30 13:07:44 mythtv mythbackend: mythbackend[2980]: E DeleteThread mainserver.cpp:2411 (DoDeleteThread) MainServer: ERROR opening database connection for Delete Thread for chanid 1184 recorded at 2020-04-20T13:30:00Z.  Program will NOT be deleted.
Jan 30 13:07:44 mythtv mythbackend: mythbackend[2980]: E DeleteThread mythdbcon.cpp:238 (OpenDatabase) [DBManager612] Unable to connect to database!
Jan 30 13:07:44 mythtv mythbackend: mythbackend[2980]: E DeleteThread mythdbcon.cpp:239 (OpenDatabase) Driver error was [1/1040]:#012QMYSQL: Unable to connect#012Database error was:#012Too many connections
Jan 30 13:07:44 mythtv mythbackend: mythbackend[2980]: E DeleteThread mainserver.cpp:2411 (DoDeleteThread) MainServer: ERROR opening database connection for Delete Thread for chanid 1184 recorded at 2020-04-21T13:30:00Z.  Program will NOT be deleted.

Additional information

At the time, lsof /var/run/mysqld/mysqld.sock showed many open handles to mysqld.

While I don't recall putting it there (perhaps a package did?), this 100-connection limit has been in my config file since 2016 and seems reasonable: why would MythTV need more than 100 connections to the database? Daily operation has always worked fine, since it deletes only a few files per day. So perhaps the parallel connections eventually time out and go away? The issue may appear only when many deletes are queued in a short time.

twitham1 commented Feb 1, 2024

I doubt it matters, but I will mention that these recordings had deletepending set, which made them invisible to the frontend. Reference: https://forum.mythtv.org/viewtopic.php?f=36&t=2888&p=14057 and https://lists.mythtv.org/pipermail/mythtv-users/2024-January/413246.html

Possibly related: should deletepending recordings be automatically deleted? They are not visible to the frontend, so they are unusable; why keep them? In my case they stole over 300G for over 3 years! Back then I had a few shows with unlimited episodes, and I changed their rules to keep only 10 episodes. These invisible deletepending files appear to have come from that change; I wonder if that was because those deletes also failed due to the 100-connection limit.

@kmdewaal (Contributor) commented:

It looks like I can reproduce the underlying problem with the use of connections during deletes.
Test sequence:

  • Create 10 small recordings with Live TV, about 10 seconds per channel.
  • These recordings will be automatically deleted.
  • Watch the total number of open connections with the following MySQL command:
    show status where variable_name = 'threads_connected';
  • On my system this value is 17 when mythbackend is running and mythfrontend only shows the menu.
  • With Live TV the number increases to 23.
  • When the backend starts to delete the 10 small recordings, the number of connections increases to 36 or so.
  • After a while the number of connections slowly goes down again.

So it looks like each delete action creates a new connection, and the old connections are closed much later. When there is a limit on the number of connections, as mentioned in this ticket, this can cause the reported failure.
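Under that hypothesis, the failure after roughly 100 deletes follows directly. A toy Python simulation (not MythTV code; the 4-second interval comes from this report, the linger time is an assumption, and the backend's other baseline connections noted above are ignored) shows how per-delete connections that linger longer than the delete interval pile up past a max_connections cap of 100:

```python
def simulate(deletes, max_connections, interval_s, linger_s):
    """Each delete opens a new connection at t = i * interval_s and
    closes it linger_s later.  Count the deletes that fail because
    the open-connection cap has already been reached."""
    open_until = []  # close times of currently open connections
    failures = 0
    for i in range(deletes):
        t = i * interval_s
        open_until = [c for c in open_until if c > t]  # expire closed ones
        if len(open_until) >= max_connections:
            failures += 1  # "Too many connections"
        else:
            open_until.append(t + linger_s)
    return failures


# With a delete every 4 s and connections lingering far longer than the
# whole run, the cap of 100 is hit after the first 100 deletes:
print(simulate(deletes=235, max_connections=100, interval_s=4, linger_s=7200))
# prints 135: 100 deletes succeed, the remaining 135 fail
```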

A quick inspection of the database code shows that it actually creates a pool of connections, so each new connection request from MythTV should re-use an existing connection; when no connection can be re-used, a new database connection is created. Connections in the pool are automatically deleted when they have not been used for two hours.
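That pool behaviour, as described, can be sketched in Python (a hypothetical model, not the actual MythDB code): acquire() re-uses a free connection when one exists, otherwise it opens a new one, and a purge step drops connections that have been idle for more than two hours.

```python
IDLE_LIMIT = 2 * 60 * 60  # two hours, per the description above


class Pool:
    def __init__(self):
        self.free = []  # (connection_id, last_used_time) tuples
        self.next_id = 0

    def acquire(self, now):
        self.purge(now)
        if self.free:            # re-use an existing free connection
            conn_id, _ = self.free.pop()
            return conn_id
        self.next_id += 1        # otherwise open a new connection
        return self.next_id

    def release(self, conn_id, now):
        self.free.append((conn_id, now))

    def purge(self, now):
        """Drop connections unused for more than two hours."""
        self.free = [(c, t) for c, t in self.free if now - t <= IDLE_LIMIT]


pool = Pool()
a = pool.acquire(now=0)                     # opens connection 1
pool.release(a, now=0)
b = pool.acquire(now=10)                    # re-uses connection 1
pool.release(b, now=10)
c = pool.acquire(now=10 + IDLE_LIMIT + 1)   # idle too long: opens connection 2
print(a, b, c)  # prints: 1 1 2
```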

This description of how the code works does not match the observed temporary increase in the number of connections, so I am obviously missing something....
