Download an open-source LLM model in a RayCluster YAML file?

Dear Team,

How can I download an open-source LLM model from Hugging Face in the RayCluster YAML configuration? image: Package text-generation-inference · GitHub. And where should I configure the model name?

Can I configure this while deploying the Ray cluster? Your support truly helps.
(Not using Kubernetes.)


@ranimg You might be able to download it as part of the setup commands for each node in the cluster.
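As an illustration, here is a hedged sketch of what that could look like in a Ray `cluster.yaml`. The model name `gpt2` is only a placeholder for whatever Hugging Face repo id you want, and this assumes the nodes have `pip` and internet access:

```yaml
# Sketch of a Ray cluster config that pre-downloads a model on each node.
# NOTE: "gpt2" is a placeholder repo id -- substitute the model you need.
cluster_name: llm-cluster

setup_commands:
  # Install the Hugging Face hub client used for downloading.
  - pip install -U "huggingface_hub[cli]"
  # Download the model weights into the default HF cache
  # (~/.cache/huggingface) so workers load them locally instead of
  # pulling them at first request.
  - huggingface-cli download gpt2
```

With this approach the model name lives in the setup commands of the cluster YAML; your Python serving code would then reference the same repo id (or read it from an environment variable) so the cached weights are picked up.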

Hi @Jules_Damji,
I tried this approach initially and ran into two problems:

  1. During Docker execution for this container, I kept getting a timeout exception and couldn't find a solution. It is exactly this issue (I'm facing it in Azure).
  2. Where exactly do I specify which LLM model should be downloaded? Does this go in the cluster.yaml or in the Python wrapper?

Please let me know.