Possible thread leak in JMX Fetch 0.48.0 #521

Open
phongvq opened this issue May 23, 2024 · 1 comment

phongvq commented May 23, 2024

Hi,

We have observed the thread leak below on our server. We suspect it is caused by JMX connection timeouts from JMXFetch: the number of "JMX server connection timeout" threads keeps increasing (it does drop back down occasionally, but by far less than it grows - see below).

zgrep -c 'JMX server connection' 20240512/*_19*.dump.gz
20240512/jstack20240512_1900.dump.gz:3676
20240512/jstack20240512_1901.dump.gz:3701
20240512/jstack20240512_1902.dump.gz:3684
20240512/jstack20240512_1903.dump.gz:3688
20240512/jstack20240512_1904.dump.gz:3713
20240512/jstack20240512_1905.dump.gz:3696
20240512/jstack20240512_1906.dump.gz:3700
20240512/jstack20240512_1907.dump.gz:3725
20240512/jstack20240512_1908.dump.gz:3718
20240512/jstack20240512_1909.dump.gz:3712
20240512/jstack20240512_1910.dump.gz:3728
20240512/jstack20240512_1911.dump.gz:3741
20240512/jstack20240512_1912.dump.gz:3724
20240512/jstack20240512_1913.dump.gz:3730
20240512/jstack20240512_1914.dump.gz:3753
20240512/jstack20240512_1915.dump.gz:3736
20240512/jstack20240512_1916.dump.gz:3740
20240512/jstack20240512_1917.dump.gz:3765
20240512/jstack20240512_1918.dump.gz:3748
20240512/jstack20240512_1919.dump.gz:3752
20240512/jstack20240512_1920.dump.gz:3777
20240512/jstack20240512_1921.dump.gz:3763
20240512/jstack20240512_1922.dump.gz:3764
20240512/jstack20240512_1923.dump.gz:3787
20240512/jstack20240512_1924.dump.gz:3786
20240512/jstack20240512_1925.dump.gz:3776
20240512/jstack20240512_1926.dump.gz:3788
20240512/jstack20240512_1927.dump.gz:3805
20240512/jstack20240512_1928.dump.gz:3788
20240512/jstack20240512_1929.dump.gz:3792
20240512/jstack20240512_1930.dump.gz:3817
20240512/jstack20240512_1931.dump.gz:3800
20240512/jstack20240512_1932.dump.gz:3804
20240512/jstack20240512_1933.dump.gz:3829
20240512/jstack20240512_1934.dump.gz:3812
20240512/jstack20240512_1935.dump.gz:3816
20240512/jstack20240512_1936.dump.gz:3841
20240512/jstack20240512_1937.dump.gz:3834
20240512/jstack20240512_1938.dump.gz:3828
20240512/jstack20240512_1939.dump.gz:3844

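To make the trend easier to watch without taking full jstack dumps every minute, here is a rough sketch of counting these threads from inside the JVM via `ThreadMXBean`. This is just an illustration of the same measurement as the zgrep above, not something JMXFetch itself does; the thread-name prefix is the one seen in the dumps.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class TimeoutThreadCounter {
    public static void main(String[] args) throws InterruptedException {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        while (true) {
            long count = 0;
            // dumpAllThreads(false, false): skip locked monitors/synchronizers, we only need names
            for (ThreadInfo info : threads.dumpAllThreads(false, false)) {
                if (info.getThreadName().startsWith("JMX server connection timeout")) {
                    count++;
                }
            }
            System.out.println(System.currentTimeMillis() + " JMX timeout threads: " + count);
            Thread.sleep(60_000); // sample once per minute, like the jstack snapshots above
        }
    }
}
```
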
In a thread dump of the Java app (which has JMX remote enabled), we see many threads like this:

"JMX server connection timeout 10668711" #10668711 daemon prio=5 os_prio=0 cpu=5609.35ms elapsed=149027.94s tid=0x00007f31bc97c000 nid=0x70b5 in Object.wait()  [0x00007f2de434c000]
   java.lang.Thread.State: TIMED_WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <no object reference available>
	at com.sun.jmx.remote.internal.ServerCommunicatorAdmin$Timeout.run([email protected]/ServerCommunicatorAdmin.java:171)
	- waiting to re-lock in wait() <0x00000010293bbe98> (a [I)
	at java.lang.Thread.run([email protected]/Thread.java:829)

...

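As I understand it, each client connection accepted by the RMI connector server spawns one of these "JMX server connection timeout" daemon threads (`ServerCommunicatorAdmin$Timeout`), and the thread only goes away when the connection is closed or its idle timeout elapses. So one hypothesis is that the collector keeps opening connections faster than they are closed or time out. A minimal sketch of that pattern (the service URL is hypothetical, using the `localhost:3000` instance name from the logs below; this is not JMXFetch's actual code):

```java
import javax.management.MBeanServerConnection;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class LeakyJmxClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical target matching the instance name reported by JMXFetch
        JMXServiceURL url =
            new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:3000/jmxrmi");

        for (int i = 0; i < 100; i++) {
            // Each connect() registers a connection on the server and starts a
            // "JMX server connection timeout" thread for it.
            JMXConnector connector = JMXConnectorFactory.connect(url, null);
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            System.out.println("MBean count: " + mbsc.getMBeanCount());
            // Without this close(), the server-side timeout thread lingers until the
            // connection's idle timeout fires, so repeated polling accumulates threads.
            connector.close();
        }
    }
}
```
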
JMX Fetch logs

2024-05-22 01:24:57 PDT | JMX | WARN | JmxAttribute | Unable to get metrics from xxxxx - MaxActive: java.lang.NumberFormatException: For input string: "Local:1 
Remote:1"
2024-05-22 01:24:57 PDT | JMX | WARN | JmxAttribute | Unable to get metrics from xxxx: java.lang.NumberFormatException: For input string: "CacheStats{hitCount=0, missCount=0, loadSuccessCount=0, loadExceptionCount=0, totalLoadTime=0, evictionCount=0}"
2024-05-22 01:24:57 PDT | JMX | WARN | JmxAttribute | Unable to get metrics from xxxx : java.lang.NumberFormatException: empty String
2024-05-22 01:24:58 PDT | JMX | WARN | App | Unable to instantiate or initialize instance localhost:3000 for an unknown reason.Java heap space

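Side note on the NumberFormatException warnings: these look like JMXFetch trying to treat non-numeric attribute values as metric values. A minimal reproduction of the parse failures seen above (illustrative only, not JMXFetch's actual code):

```java
public class NumberFormatDemo {
    public static void main(String[] args) {
        // Attribute values like these cannot be parsed as numbers, which is what
        // produces the "Unable to get metrics" warnings in the log.
        String[] values = {
            "Local:1 Remote:1",
            "CacheStats{hitCount=0, missCount=0, ...}",
            ""
        };
        for (String v : values) {
            try {
                System.out.println(Double.parseDouble(v));
            } catch (NumberFormatException e) {
                System.out.println("NumberFormatException: " + e.getMessage());
            }
        }
    }
}
```
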
JMX Fetch version: 0.48.0

Possibly similar behavior to #370

jk2l commented Oct 16, 2024

I wonder whether this could be related to my out-of-memory error in #519, where the process simply dies a few weeks after launch.
