Performance issues #118
Did some benchmarking around queue job inserting using BullMQ and got the following results:
Why is inserting in batch more expensive than inserting one by one? I don't know, but it is. My guesses that the memory usage came from creating a big recipients array on the JavaScript side were wrong; the problem was inserting into the queue.
Did some benchmarks with Bee Queue, but it didn't result in better performance; actually, it changed nothing.
Removed two queries when receiving notifications from Amazon SNS. Now we don't need to check for recipient existence nor create the events separately from upserting the recipient.
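The separate existence check and event write can presumably be collapsed into a single upsert statement. A minimal sketch, assuming a PostgreSQL-style `recipients` table queried through a `pg`-like client; the table and column names here are hypothetical, not the project's actual schema:

```javascript
// Hypothetical single-statement upsert replacing a SELECT (existence check)
// followed by an INSERT. One round trip instead of two or three.
const upsertRecipient = `
  INSERT INTO recipients (message_id, email, last_event)
  VALUES ($1, $2, $3)
  ON CONFLICT (message_id, email)
  DO UPDATE SET last_event = EXCLUDED.last_event
  RETURNING id;
`;

// With a pg-style client this would run as:
//   const { rows } = await client.query(upsertRecipient, [msgId, email, 'delivered']);
console.log(upsertRecipient.trim());
```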
Split the SNS webhook HTTP service in c810331.
Touchy subject: memory is usually considerably cheaper than vCPU [0]. Some numbers on CPU usage in both tests would be nice! Memory is also easier to manage/scale/allocate than CPU. Besides all that, Node.js applications tend to consume more CPU than memory. Since we [currently] work almost exclusively with this technology, depending on the usage of each resource, it may be worth sacrificing some RAM. (: [0]
Hey @pellizzetti, I tried doing some CPU benchmarking but couldn't manage to get actual CPU-consumption numbers. Do you have any tips for me on that?
If you created new, isolated scripts for your benchmarking, you won't need to handle startup/setup time, so I guess a simple
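One built-in option for an isolated script: Node's `process.cpuUsage()` reports user/system CPU time for the current process, which can be compared against wall-clock time. A sketch, with a placeholder loop standing in for the actual queue-insert benchmark:

```javascript
// Measure CPU time vs wall-clock time for a workload using only Node builtins.
const startCpu = process.cpuUsage();
const startWall = process.hrtime.bigint();

// Placeholder workload; replace with the queue-insert benchmark.
let acc = 0;
for (let i = 0; i < 5_000_000; i++) acc += i % 7;

const cpu = process.cpuUsage(startCpu); // { user, system } in microseconds
const wallMs = Number(process.hrtime.bigint() - startWall) / 1e6;

console.log(`user CPU:   ${(cpu.user / 1000).toFixed(1)}ms`);
console.log(`system CPU: ${(cpu.system / 1000).toFixed(1)}ms`);
console.log(`wall clock: ${wallMs.toFixed(1)}ms`);
```

If user+system CPU time is close to wall-clock time, the script is CPU-bound; a large gap suggests it is mostly waiting on I/O (e.g. Redis round trips).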
If you want to take a deeper dive, take some time to look over
Thought this would be a fun quest, decided to jump in, and I got your answer :)
Here is the full story:
`const result = (await multi.exec()) as [null | Error, string][];`
This basically loops over all the 20k objects we just added, checking some flags and building a
Then it loops over all 20k objects again and sends them one by one to Redis. So there are a bunch of loops over all the data:
In the end, you have a bunch of copies of the original array; every
Performance issues are tricky; you might think that using
To understand this kind of issue you must dig into the code, read and understand what folks are doing. A base ground rule when working with big data in JavaScript.
✌️
I sent an email to 18k contacts and noticed some CPU and memory problems.
Summary
Errors
When receiving some AWS events, the HTTP service was logging some errors with this trace:
Remediations
Infrastructure
Application