Ollama and LM Studio

Ollama and LM Studio are popular tools for running LLMs on your own computer. However, they are not designed for shared clusters like Vera or Alvis: their HTTP servers lack features such as API-key authentication, and they may not scale well on a cluster. If you would like to run LLM inference on Vera or Alvis, please refer to vLLM.

Ollama

We do not provide Ollama on our clusters for the reasons mentioned above. Users who accept the risk can download the binary from its official website. They should also set OLLAMA_MODELS to their project directory to avoid filling up their home directory.
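As a minimal sketch of the OLLAMA_MODELS setup described above: the snippet below redirects Ollama's model cache away from the home directory before the server is started. The project path is a placeholder; substitute your own project's storage directory.

```shell
# Placeholder project directory -- replace with your actual project storage path.
PROJECT_DIR="${PROJECT_DIR:-$HOME/my-project}"

# Point Ollama's model cache at the project directory instead of ~/.ollama.
export OLLAMA_MODELS="$PROJECT_DIR/ollama-models"
mkdir -p "$OLLAMA_MODELS"

echo "Ollama models will be stored in: $OLLAMA_MODELS"
# The downloaded binary can then be started with:  ./ollama serve
```

Putting the export in your shell startup file (or a job script) ensures every Ollama invocation uses the project directory.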

LM Studio

LM Studio is installed because of its user-friendly graphical interface and its support for offline inference (without launching an HTTP server). It can be found under Menu > C3SE > LM Studio. Users should also change the model directory to their project directory.
