This is more a question than an issue. We are running workloads that require batch processing of records, and multiple workers are used to speed up the process.
The problem is that each new worker requires 2 GB of memory to load the libpostal data. This seems wasteful, since the data, as far as I know, doesn't change. Effectively, memory ends up holding multiple copies of the libpostal model, and the startup of each worker is significantly delayed.
Is there any way to share the memory containing the data model between workers, or is putting libpostal behind a simple REST API our only option?