I have 500k documents in MongoDB, and every time I restart KeystoneJS, it deletes all the indexes I previously set in the mongo shell!
Sometimes fetching a single piece of data can drive MongoDB to 400% CPU usage, and the process hangs.
I use the MongoDB profiler log to optimize my database indexes, and when I restart KeystoneJS, all of that work disappears.
Something is wrong with relationships; Keystone does not handle large datasets well, even 10k records, at least when using MongoDB.
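For reference, a sketch of the profiling workflow mentioned above, using standard mongo shell commands (the 100 ms threshold is illustrative):

```js
// Log every operation slower than 100 ms to the system.profile collection
db.setProfilingLevel(1, { slowms: 100 })

// Inspect the most recent slow operations to find missing indexes
db.system.profile.find().sort({ ts: -1 }).limit(5)
```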
Add `isIndexed: true` to your schema fields and Keystone will create the indexes itself. For many-to-many relations, which define a new collection like `project_users_user_projects`, you can create indexes from the mongo shell as usual; Keystone will not refresh them.
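A minimal sketch of that suggestion, assuming Keystone 5 with the Mongoose adapter; the list and field names here are illustrative:

```js
const { Keystone } = require('@keystonejs/keystone');
const { MongooseAdapter } = require('@keystonejs/adapter-mongoose');
const { Text, Relationship } = require('@keystonejs/fields');

const keystone = new Keystone({ adapter: new MongooseAdapter() });

keystone.createList('Post', {
  fields: {
    // Per the comment above, isIndexed: true tells Keystone to create
    // this index itself, so it survives restarts
    title: { type: Text, isIndexed: true },
    // Many-to-many relation: the generated join collection's indexes
    // still have to be managed from the mongo shell, as noted above
    tags: { type: Relationship, ref: 'Tag', many: true },
  },
});
```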
But I was not able to create compound indexes at all, as they are dropped after a server restart :(
I just had this happen with a compound index I created manually. As far as I can tell, there is no way to create compound indices in KeystoneJS, which, if true, should be a non-starter for using KeystoneJS in the first place.
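For anyone hitting the same wall, this is the kind of manually created compound index that gets dropped on restart, per the reports above (collection and field names are illustrative):

```js
// Compound index over two fields, created in the mongo shell
db.posts.createIndex({ author: 1, createdAt: -1 })
```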