ANN: Compute Servers -- CoCalc finally has GPUs and very powerful and affordable VMs #7048
williamstein announced in Announcements
Replies: 1 comment
- Congrats!
CoCalc now features robust compute servers: you can attach a dedicated remote machine to a project and use it to run terminals and Jupyter notebooks, with computing resources that go far beyond what the default project environment provides. To get started, create a compute server in a project, select a software image and (optionally) a GPU, and then run any terminal or Jupyter notebook on that server for an on-demand fee, charged by the second while the server is in use.
GPU support is extensive, with options including A100 80GB, A100 40GB, L4, and T4 GPUs, each paired with carefully configured software images. These images include SageMath, Google Colab, Julia, PyTorch, TensorFlow, and the CUDA Toolkit, covering a wide range of workloads. Pricing is highly competitive, particularly for spot instances. Compute servers are a major enhancement over default projects, offering far more speed, flexibility, and computational power, and they transform the way users can use CoCalc for their projects.
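For example, once a Jupyter notebook is running on a GPU compute server, a short sanity check like the following confirms that the GPU is actually visible. This is a minimal sketch that assumes the selected software image ships PyTorch built with CUDA support, as the PyTorch and Google Colab images mentioned above do:

```python
# Minimal GPU sanity check for a Jupyter notebook on a compute server.
# Assumes the chosen software image includes PyTorch with CUDA support
# (e.g., the PyTorch or Google Colab images mentioned above).
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("GPU detected:", torch.cuda.get_device_name(0))
    # Run a small matrix multiply on the GPU to confirm it really works.
    x = torch.randn(1024, 1024, device=device)
    y = x @ x
    print("Result lives on:", y.device)
else:
    print("No GPU visible -- check that the compute server has a GPU attached.")
```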
To set up a compute server in CoCalc, open your project, create a compute server via the "Servers" button, and select your desired software image and, optionally, a GPU. To use the server, create a terminal file or a Jupyter notebook, move it to the server through the menu in its upper left corner, and remember to sync files so you can edit them while computations run.
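One easy way to confirm that a notebook has actually moved to the compute server, rather than still running in the project, is to inspect the machine the kernel is executing on. The snippet below is an illustrative sketch using only the Python standard library; the hostname, core count, and RAM you see depend on the server configuration you chose.

```python
# Quick check of where a notebook kernel is running and what resources it has.
# Illustrative only: the values printed depend on the compute server you set up.
import os
import platform

print("Hostname:", platform.node())
print("CPU cores:", os.cpu_count())

# Total physical memory via the standard library (Linux-only sysconf names).
page_size = os.sysconf("SC_PAGE_SIZE")   # bytes per memory page
num_pages = os.sysconf("SC_PHYS_PAGES")  # number of physical pages
print("Total RAM: %.1f GB" % (page_size * num_pages / 1024**3))
```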
Finally, a quick recap of how to get started with compute servers on CoCalc: create a server from the "Servers" button, pick a software image and an optional GPU, and move a terminal or Jupyter notebook onto it. Remember, compute servers are billed by the second and only while they exist.