⚡ feat: Static File Caching #3455
Conversation
I see a potential issue with the compression implementation. For users like me who use Traefik with Brotli or Zstd compression, having Gzip at the application level prevents these more efficient methods from being applied. It would be better if we had something like an environment variable to disable compression.
That's a good point @lidonius1122, I can add that.
Thanks!
Done @lidonius1122
Seems to be working as it should so far @mawburn
I've added notes in the docs to mention this.
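To illustrate the suggestion, here is a minimal sketch of gating the app-level Gzip step behind an environment variable; the `DISABLE_COMPRESSION` name is a hypothetical placeholder, not necessarily what the PR ships.

```js
const express = require('express');
const compression = require('compression');

const app = express();

// Skip app-level Gzip when a reverse proxy such as Traefik already applies
// Brotli or Zstd compression; otherwise compress response bodies here.
// NOTE: the env variable name is an assumption for illustration only.
if (process.env.DISABLE_COMPRESSION !== 'true') {
  app.use(compression());
}
```

Letting the proxy negotiate the encoding avoids double compression and keeps the more efficient Brotli/Zstd responses intact.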
* add static file cache
* disable compression env variable
Summary
Related documentation PR: LibreChat-AI/librechat.ai#95
This pull request introduces static file caching and Gzip compression to the project, significantly enhancing its performance and efficiency. By implementing static file caching, we enable frequently accessed files like images, CSS, and JavaScript to be stored locally on the user's device, reducing the need for repeated server requests and decreasing load times. This is only triggered when `NODE_ENV` is set to `production`. See MDN: Cache-Control.

Additionally, compression reduces the size of these files before transmission, minimizing bandwidth usage and further speeding up page load times. Combined with the `Cache-Control` headers, this should put negligible load on the server. These optimizations collectively lead to a faster, more responsive user experience, lower server load, and improved scalability, making the application more robust and user-friendly.
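As a rough sketch of the behavior described above (not the PR's exact code), the wiring in `api/server/index.js` could look like the following; the `client/dist` path and the one-day lifetime are placeholder assumptions.

```js
const path = require('path');
const express = require('express');
const compression = require('compression');

const app = express();

// Gzip response bodies before they leave the server.
app.use(compression());

// Serve the prebuilt client assets. Long-lived Cache-Control headers are only
// sent when NODE_ENV is 'production', so development always gets fresh files.
app.use(
  express.static(path.join(__dirname, 'client', 'dist'), {
    maxAge: process.env.NODE_ENV === 'production' ? '1d' : 0,
  }),
);
```

With `max-age` set, repeat visits are served straight from the browser cache without touching the API server, which is where the transfer-size numbers below come from.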
On my instance, here are the results:
* Using compression → 2.4 MB transferred, from 6.2 MB total
* Using compression + local caching on a refresh → 664 KB transferred, from 6.2 MB total

Without these, the full 6.2 MB was transferred on every page load/refresh, or we were getting 304 responses from the API, causing unnecessary extra load on the API server.
Update:
I have this deployed in our production environment for thousands of users, and it seems to have also fixed our issues with memory usage. The instance has 8 GB of memory and was previously sitting at 70-85% memory usage at all times; now it's at about 15-18%.
Copilot Summary
This pull request includes several changes to enhance the server's performance and improve caching mechanisms. The most important changes include adding the `compression` middleware, implementing a static file cache control utility, and updating the relevant configurations and routes.

Performance Improvements:
* `api/package.json`: Added the `compression` package to enable response compression.
* `api/server/index.js`: Integrated the `compression` middleware to compress response bodies for all requests.

Cache Control Enhancements:
* `.env.example`: Added configuration options for static file cache control, including `STATIC_CACHE_MAX_AGE` and `STATIC_CACHE_S_MAX_AGE`.
* `api/server/utils/staticCache.js`: Created a new utility to handle static file caching with configurable cache headers based on environment variables.
* `api/server/index.js`: Replaced `express.static` with the new `staticCache` utility for serving static files.
* `api/server/routes/static.js`: Updated the static routes to use the new `staticCache` utility.
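To make the described utility concrete, here is a hedged sketch of what `api/server/utils/staticCache.js` could look like; the fallback lifetimes and the exact header string are assumptions rather than the PR's verbatim implementation.

```js
// api/server/utils/staticCache.js (sketch)
const express = require('express');

const oneDayInSeconds = 86400;

// Cache lifetimes come from the env variables added to .env.example,
// falling back to one day when unset (the fallback is assumed here).
const maxAge = process.env.STATIC_CACHE_MAX_AGE || oneDayInSeconds;
const sMaxAge = process.env.STATIC_CACHE_S_MAX_AGE || oneDayInSeconds;

function staticCache(staticPath) {
  return express.static(staticPath, {
    setHeaders: (res) => {
      // Only attach long-lived cache headers for production builds;
      // s-maxage targets shared caches (CDNs / reverse proxies).
      if (process.env.NODE_ENV === 'production') {
        res.setHeader('Cache-Control', `public, max-age=${maxAge}, s-maxage=${sMaxAge}`);
      }
    },
  });
}

module.exports = staticCache;
```

Routes and `api/server/index.js` would then call `app.use(staticCache(pathToDist))` wherever `express.static(pathToDist)` was used before.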