Memory Leak setInterval #16488

Closed
Mortalife opened this issue Jan 18, 2025 · 8 comments
Labels: bug, memory leak

Comments

@Mortalife

What version of Bun is running?

1.1.45+196621f25

What platform is your computer?

Linux 5.15.167.4-microsoft-standard-WSL2 x86_64 x86_64

What steps can reproduce the bug?

Describe the bug
I seem to have increasing memory from a setInterval in my code. I've run console.log(require("bun:jsc").heapStats()); and seen that the number of objects keeps increasing, and with it the memory.
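For illustration, a rough sketch of that kind of heapStats monitoring (objectCount and heapSize are the fields relevant to the observation above; treat the exact shape of the stats object as version-dependent):

// heapstats-monitor.ts - sketch: poll bun:jsc heap statistics once per second
import { heapStats } from "bun:jsc";

setInterval(() => {
  const stats = heapStats();
  console.log(
    new Date().toISOString(),
    "objects:", stats.objectCount,
    "heap:", Math.trunc(stats.heapSize / 1024 / 1024), "MB",
    "rss:", Math.trunc(process.memoryUsage.rss() / 1024 / 1024), "MB"
  );
}, 1000);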

Managed to distil it down to a simple script.

To Reproduce

This is a simple setInterval, meant to represent doing "work".

// index.ts
setInterval(async () => {
  const time = Date.now();

  // placeholder for "work": warn if this tick took longer than one interval
  if (Date.now() - time > 100) {
    console.log("LAG");
  }

  // time % 10 === 0 hits on roughly 1 in 10 ticks, so this logs about once a second
  if (time % 10 === 0) {
    console.log(
      "Memory usage: ",
      Math.trunc(process.memoryUsage.rss() / 1024 / 1024),
      "MB"
    );
  }
}, 100);

Put it in a file and run it with bun run index.ts

What is the expected behavior?

Expected behavior
I expect it not to leak.

What do you see instead?

Current Output
The current output over less than a minute:

Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  35 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  38 MB
Memory usage:  39 MB
Memory usage:  39 MB
Memory usage:  39 MB
Memory usage:  39 MB
Memory usage:  39 MB

Additional information

Additional context
Tried it in Node 22; it didn't leak.

@Mortalife added the bug and needs triage labels on Jan 18, 2025
@ArashOrangi

ArashOrangi commented Jan 20, 2025

I had the same issue, and after testing a few APIs and adding middleware to detect memory leaks, I realized something interesting. Regardless of the framework—whether it's HONO or Express—everything works fine when running with Node.js. However, when I run the same code with Bun, I consistently encounter memory leaks.

You can test (monitor) it by adding this code:

// middleware: log RSS and heap usage before and after each request
app.use('*', async (c, next) => {
    const memoryBefore = process.memoryUsage();
    await next();
    const memoryAfter = process.memoryUsage();

    console.log(`[Memory Usage for ${c.req.method} ${c.req.url}]:`, {
        before: {
            rss: `${(memoryBefore.rss / 1024 / 1024).toFixed(2)} MB`,
            heapUsed: `${(memoryBefore.heapUsed / 1024 / 1024).toFixed(2)} MB`,
        },
        after: {
            rss: `${(memoryAfter.rss / 1024 / 1024).toFixed(2)} MB`,
            heapUsed: `${(memoryAfter.heapUsed / 1024 / 1024).toFixed(2)} MB`,
        },
    });
});

Test result in Bun:

<-- POST /api/search/provider
prisma:info Starting a postgresql pool with 9 connections.
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "283.54 MB",
    heapUsed: "12.05 MB",
  },
  after: {
    rss: "320.93 MB",
    heapUsed: "13.61 MB",
  },
}
--> POST /api/search/provider 200 2s
<-- POST /api/search/provider
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "303.11 MB",
    heapUsed: "19.92 MB",
  },
  after: {
    rss: "326.79 MB",
    heapUsed: "19.95 MB",
  },
}
--> POST /api/search/provider 200 822ms
<-- POST /api/search/provider
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "326.80 MB",
    heapUsed: "25.01 MB",
  },
  after: {
    rss: "333.27 MB",
    heapUsed: "25.74 MB",
  },
}
--> POST /api/search/provider 200 901ms
<-- POST /api/search/provider
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "331.24 MB",
    heapUsed: "30.93 MB",
  },
  after: {
    rss: "337.97 MB",
    heapUsed: "17.07 MB",
  },
}
--> POST /api/search/provider 200 929ms
<-- POST /api/search/provider
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "337.97 MB",
    heapUsed: "22.30 MB",
  },
  after: {
    rss: "343.75 MB",
    heapUsed: "22.31 MB",
  },
}
--> POST /api/search/provider 200 828ms
<-- POST /api/search/provider
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "348.54 MB",
    heapUsed: "27.36 MB",
  },
  after: {
    rss: "351.51 MB",
    heapUsed: "27.36 MB",
  },
}
--> POST /api/search/provider 200 868ms
<-- POST /api/search/provider
[Memory Usage for POST http://localhost:3001/provider]: {
  before: {
    rss: "349.48 MB",
    heapUsed: "17.27 MB",
  },
  after: {
    rss: "353.96 MB",
    heapUsed: "18.55 MB",
  },
}

@shaffel

shaffel commented Jan 21, 2025

We have an application that receives 500,000 daily requests, and it has encountered the exact same memory leak issue. To confirm that the memory leak was specific to Bun, we ran the application with Node.js and Deno as well. However, the memory leak only occurred with Bun. To prevent server downtime, we've been restarting the server every few hours. This is a critical issue.

@Jarred-Sumner
Collaborator

Jarred-Sumner commented Jan 21, 2025

@Mortalife we have memory leak tests for setInterval in CI that run on every commit of Bun.

Memory usage:  35 MB
...
Memory usage:  39 MB

Fluctuations of 4 megabytes do not indicate a memory leak - it's normal for memory usage to fluctuate by ~15 MB or more, depending on the precise timing of memory allocations, how big the heap currently is, and how much memory the machine has available. If you can make it go up 50 MB in a setInterval like the above, please let us know. If you're actively running into a memory leak, please file a new issue with details about the specifics of what you're running into.

Regardless, I ran the script for 10 minutes and it fluctuated between 38 MB and 40 MB.

@shaffel it is unlikely that setInterval is the cause here. I suggest taking a V8 heap snapshot (via require("v8").writeHeapSnapshot()) and looking at it in Chrome DevTools. Also, console.log(require("bun:jsc").heapStats()) can be a source of useful information. Please open an issue with more details about what you're seeing.
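A minimal sketch of those two diagnostics (the snapshot filename here is just an example):

// diagnostics.ts - sketch of the two suggestions above
import { writeHeapSnapshot } from "v8";
import { heapStats } from "bun:jsc";

// Write a .heapsnapshot file that can be opened in Chrome DevTools (Memory tab).
// The filename argument is optional and arbitrary.
const file = writeHeapSnapshot(`heap-${Date.now()}.heapsnapshot`);
console.log("wrote snapshot to", file);

// Print JSC heap statistics (heap size, object counts, etc.).
console.log(heapStats());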

prisma:info Starting a postgresql pool with 9 connections.
[Memory Usage for POST http://localhost:3001/provider]: {

@ArashOrangi the issue you're running into is most likely related to napi in Bun when using Prisma. We measurably made improvements there after Bun v1.1.36 or so, but memory usage with Prisma is still not where it needs to be. It is unrelated to setInterval - feel free to open a separate issue.

@Jarred-Sumner closed this as not planned Jan 21, 2025
@Mortalife
Author

Mortalife commented Jan 21, 2025

Regardless, I ran the script for 10 minutes and it fluctuated between 38 MB and 40 MB.

Then I apologise for giving a bad example, but the issue is present. I'll add a more robust example. In this one the "work" is a little more substantial; you can see the GC kick in, but memory is still creeping up.

2025-01-21T10:32:00.590Z Memory usage:  34 MB
2025-01-21T10:32:10.632Z Memory usage:  47 MB
2025-01-21T10:32:20.644Z Memory usage:  57 MB
2025-01-21T10:32:30.661Z Memory usage:  61 MB
2025-01-21T10:32:40.682Z Memory usage:  68 MB
2025-01-21T10:32:50.692Z Memory usage:  74 MB
2025-01-21T10:33:00.714Z Memory usage:  77 MB
2025-01-21T10:33:10.737Z Memory usage:  84 MB
2025-01-21T10:33:20.754Z Memory usage:  89 MB
2025-01-21T10:33:30.777Z Memory usage:  92 MB
2025-01-21T10:33:40.801Z Memory usage:  99 MB
2025-01-21T10:33:50.820Z Memory usage:  105 MB
2025-01-21T10:34:00.844Z Memory usage:  109 MB
2025-01-21T10:34:10.868Z Memory usage:  50 MB
2025-01-21T10:34:20.884Z Memory usage:  51 MB
2025-01-21T10:34:30.908Z Memory usage:  53 MB
2025-01-21T10:34:40.934Z Memory usage:  54 MB
2025-01-21T10:34:50.955Z Memory usage:  57 MB
2025-01-21T10:35:00.979Z Memory usage:  64 MB
2025-01-21T10:35:10.999Z Memory usage:  71 MB
2025-01-21T10:35:21.021Z Memory usage:  58 MB
2025-01-21T10:35:31.046Z Memory usage:  66 MB
2025-01-21T10:35:41.070Z Memory usage:  72 MB
2025-01-21T10:35:51.088Z Memory usage:  57 MB
2025-01-21T10:36:01.111Z Memory usage:  59 MB
2025-01-21T10:36:11.130Z Memory usage:  61 MB
2025-01-21T10:36:21.147Z Memory usage:  63 MB
2025-01-21T10:36:31.169Z Memory usage:  70 MB
2025-01-21T10:36:41.188Z Memory usage:  76 MB
2025-01-21T10:36:51.207Z Memory usage:  63 MB
2025-01-21T10:37:01.226Z Memory usage:  70 MB
2025-01-21T10:37:11.246Z Memory usage:  76 MB
2025-01-21T10:37:21.261Z Memory usage:  63 MB
2025-01-21T10:37:31.283Z Memory usage:  70 MB
2025-01-21T10:37:41.301Z Memory usage:  77 MB
2025-01-21T10:37:51.313Z Memory usage:  63 MB
2025-01-21T10:38:01.333Z Memory usage:  64 MB
2025-01-21T10:38:11.359Z Memory usage:  65 MB
2025-01-21T10:38:21.381Z Memory usage:  67 MB
2025-01-21T10:38:31.406Z Memory usage:  74 MB
2025-01-21T10:38:41.437Z Memory usage:  81 MB
2025-01-21T10:38:51.460Z Memory usage:  67 MB
2025-01-21T10:39:01.487Z Memory usage:  70 MB
2025-01-21T10:39:11.512Z Memory usage:  70 MB
2025-01-21T10:39:21.533Z Memory usage:  71 MB
2025-01-21T10:39:31.560Z Memory usage:  76 MB
2025-01-21T10:39:41.582Z Memory usage:  83 MB
2025-01-21T10:39:51.597Z Memory usage:  90 MB
2025-01-21T10:40:01.620Z Memory usage:  76 MB
2025-01-21T10:40:11.638Z Memory usage:  83 MB
2025-01-21T10:40:21.654Z Memory usage:  90 MB
2025-01-21T10:40:31.675Z Memory usage:  76 MB
2025-01-21T10:40:41.699Z Memory usage:  83 MB
2025-01-21T10:40:51.721Z Memory usage:  90 MB
2025-01-21T10:41:01.743Z Memory usage:  77 MB
2025-01-21T10:41:11.766Z Memory usage:  83 MB
2025-01-21T10:41:21.784Z Memory usage:  90 MB
2025-01-21T10:41:31.805Z Memory usage:  77 MB
2025-01-21T10:41:41.826Z Memory usage:  83 MB
2025-01-21T10:41:51.850Z Memory usage:  90 MB
2025-01-21T10:42:01.874Z Memory usage:  76 MB
2025-01-21T10:42:11.895Z Memory usage:  83 MB
2025-01-21T10:42:21.921Z Memory usage:  91 MB
2025-01-21T10:42:31.944Z Memory usage:  73 MB
2025-01-21T10:42:41.967Z Memory usage:  74 MB
2025-01-21T10:42:51.983Z Memory usage:  75 MB
2025-01-21T10:43:02.004Z Memory usage:  80 MB
2025-01-21T10:43:12.028Z Memory usage:  88 MB
2025-01-21T10:43:22.057Z Memory usage:  94 MB
2025-01-21T10:43:32.080Z Memory usage:  77 MB
2025-01-21T10:43:42.120Z Memory usage:  78 MB
2025-01-21T10:43:52.156Z Memory usage:  80 MB
2025-01-21T10:44:02.187Z Memory usage:  84 MB
2025-01-21T10:44:12.225Z Memory usage:  91 MB
2025-01-21T10:44:22.269Z Memory usage:  97 MB
2025-01-21T10:44:32.307Z Memory usage:  85 MB
2025-01-21T10:44:42.351Z Memory usage:  92 MB
2025-01-21T10:44:52.389Z Memory usage:  98 MB
2025-01-21T10:45:02.436Z Memory usage:  85 MB
2025-01-21T10:45:12.480Z Memory usage:  91 MB
2025-01-21T10:45:22.524Z Memory usage:  97 MB
2025-01-21T10:45:32.552Z Memory usage:  85 MB
2025-01-21T10:45:42.585Z Memory usage:  92 MB
2025-01-21T10:45:52.611Z Memory usage:  99 MB
2025-01-21T10:46:02.641Z Memory usage:  85 MB
2025-01-21T10:46:12.674Z Memory usage:  92 MB
2025-01-21T10:46:22.713Z Memory usage:  99 MB
2025-01-21T10:46:32.760Z Memory usage:  85 MB
2025-01-21T10:46:42.800Z Memory usage:  92 MB
2025-01-21T10:46:52.842Z Memory usage:  99 MB
2025-01-21T10:47:02.881Z Memory usage:  86 MB
2025-01-21T10:47:12.914Z Memory usage:  92 MB
2025-01-21T10:47:22.954Z Memory usage:  99 MB
2025-01-21T10:47:32.993Z Memory usage:  85 MB
2025-01-21T10:47:43.037Z Memory usage:  84 MB
2025-01-21T10:47:53.055Z Memory usage:  86 MB
2025-01-21T10:48:03.087Z Memory usage:  90 MB
2025-01-21T10:48:13.129Z Memory usage:  96 MB
2025-01-21T10:48:23.171Z Memory usage:  103 MB
2025-01-21T10:48:33.211Z Memory usage:  90 MB
2025-01-21T10:48:43.258Z Memory usage:  97 MB
2025-01-21T10:48:53.294Z Memory usage:  104 MB
2025-01-21T10:49:03.338Z Memory usage:  91 MB
2025-01-21T10:49:13.376Z Memory usage:  97 MB
const doWork = async (time: number) => {
  // allocate some short-lived garbage to simulate real work
  const amount = Math.floor(Math.random() * 1000);
  const mapping = new Map();

  for (let i = 0; i < amount; i++) {
    const key = Math.floor(Math.random() * 100000);
    mapping.set(key, key);
  }
};

let i = 0;
setInterval(async () => {
  const time = Date.now();

  await doWork(time);

  // warn if the work took longer than one interval
  if (Date.now() - time > 100) {
    console.log("LAG");
  }

  // log memory every 100 ticks (~10 seconds)
  if (i % 100 === 0) {
    i = 0;
    console.log(
      new Date().toISOString(),
      "Memory usage: ",
      Math.trunc(process.memoryUsage.rss() / 1024 / 1024),
      "MB",
    );
  }

  i++;
}, 100);

@shaffel

shaffel commented Jan 21, 2025

Our problem is very similar to @Mortalife's problem.

@190n
Contributor

190n commented Jan 21, 2025

@Mortalife Thank you for the new example, but I don't believe it demonstrates a memory leak in Bun. I ran it for over an hour on a Linux machine, and here's a graph of the memory usage over time. The X axis is time in seconds, and the Y axis is memory usage in MiB:

[Image: graph of memory usage (MiB) over time (seconds)]

To me it seems clear that, while there are periods of time during which the memory usage increases, it always eventually drops once the garbage collector runs. We do probably need to run garbage collection more often in cases like this, but there doesn't seem to be a true memory leak because the memory is always freed eventually.

I've been working on PR #15557 which makes our GC more aggressive and should improve memory usage in cases like this. Below is a chart comparing the same results from above (Bun v1.1.45, the black line) with your same test running on a build of Bun from that PR (the blue line). You can try this yourself by running bunx bun-pr 15557 and then using bun-15557 to run the test, although you shouldn't use the PR build in production because there are still regressions. I didn't let the PR build run for as long, but it does seem to be reducing memory usage a lot by running GC more often:

[Image: chart comparing memory usage of Bun v1.1.45 (black line) and the PR build (blue line)]

I'm hoping to get that PR merged soon, but in the meantime, if you need GC to run more often to reduce memory usage you can call the Bun.gc function described here to manually trigger garbage collection. You can also use that function when testing for memory leaks in cases like this: if running GC more makes the problem disappear, it's likely that you're not looking at a true leak but rather a case where GC is not freeing memory quite as soon as it theoretically could.
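For example, a sketch of the repro above with a forced collection added before each measurement (Bun.gc(true) runs the collection synchronously); if memory stays flat with this in place, it points to GC scheduling rather than a true leak:

let ticks = 0;
setInterval(async () => {
  // ... same "work" as in the repro above ...

  if (++ticks % 100 === 0) {
    Bun.gc(true); // force a synchronous collection before measuring
    console.log(
      new Date().toISOString(),
      "Memory usage:",
      Math.trunc(process.memoryUsage.rss() / 1024 / 1024),
      "MB"
    );
  }
}, 100);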

@Mortalife
Author

Mortalife commented Jan 21, 2025

Thanks for looking into this. It's possible that the memory leak I was experiencing came from something else occurring within the setInterval. It went from 30 MB to 2 GB in less than 12 hours, so it was definitely a leak. If you're happy that the issue isn't within the setInterval itself, I'll create a new ticket if I notice anything new. I've since migrated to Node 22, where the issue isn't occurring, so my desire to invest any more time into this is minimal.

[Image: production memory usage over time, repeatedly climbing before dropping off]

@190n
Contributor

190n commented Jan 21, 2025

I see. Sorry that Bun can't meet your needs in production.

Do the dropoffs in that chart represent the memory usage of a Bun process decreasing, or do those happen because Bun uses too much memory, gets killed, and restarts with a lower memory usage?

Given the numbers you've shared, I agree it's more likely that there is a real leak somewhere in Bun, but it still isn't a certainty. If you ever have more time to investigate this, I'll be curious what memory usage looks like if you explicitly make GC run more often (you shouldn't need to do that to get acceptable memory usage, but seeing the memory usage after doing that will narrow down what could be making the usage too high).
