On our Discord, people often ask, "How can I access AnythingLLM from my mobile phone or a different machine?"
Users can access AnythingLLM by exposing their localhost to the internet with something like ngrok or Cloudflare Tunnel, but there is no guide or tutorial on YouTube, so it would be good to write a guide about this.
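For reference, a minimal sketch of what such a guide would cover, assuming AnythingLLM is running locally on its default port 3001 (adjust the port if your instance differs):

```shell
# Option 1: ngrok (requires an ngrok account and authtoken configured)
# Forwards a public HTTPS URL to the local AnythingLLM instance.
ngrok http 3001

# Option 2: Cloudflare quick tunnel (no account needed for a temporary URL)
# Prints a trycloudflare.com URL that proxies to localhost.
cloudflared tunnel --url http://localhost:3001
```

Either command prints a public URL you can open from a phone or another machine; the tunnel only works while the command is running and the host computer is awake.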
My only reservation is that this isn't really a "safe" way to run AnythingLLM in a more "server-like" setup. They should really use a container cloud service like Render or Railway, because tunneling from their local computer basically opens a tunnel to their desktop, and if the computer goes to sleep the application won't work anyway.
@timothycarambat I don't have much knowledge in this area 😄. I saw a lot of people asking how to do it, so maybe we can write the guide and add a warning message at the beginning.
That provider is different from the primary issue this is filed under. That LLM is basically an "open" OpenAI-compatible provider that lets you use providers we don't yet have built-in support for. But yes, that warning would be appropriate.