Code Llama 70B on a dedicated server

admin
Aug 24, 2024 03:55 PM
Member Since Apr 2019

Running Code Llama 70B effectively requires GPUs with substantial video memory: at FP16 precision, the 70 billion parameters alone occupy roughly 140 GB, before counting the KV cache and activations. A server with only 128 GB of RAM and no dedicated GPUs would therefore fall short, and CPU-only inference would be too slow for practical use. For best results, consider a multi-GPU setup, such as several RTX 4090s or NVIDIA Tesla-class cards, to split the model across cards and achieve faster token generation. Quantized variants (8-bit or 4-bit) reduce the memory footprint substantially.
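The sizing reasoning above can be sketched as a quick back-of-the-envelope calculation. This is a rough estimator, not a benchmark: the `overhead` multiplier is an assumed fudge factor for the KV cache and activations, and real usage varies with context length and batch size.

```python
def estimate_vram_gb(num_params_b: float, bytes_per_param: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate (GB) for serving a model's weights.

    num_params_b:    parameter count in billions (70 for Code Llama 70B)
    bytes_per_param: 2.0 for FP16, 1.0 for 8-bit, 0.5 for 4-bit quantization
    overhead:        assumed multiplier for KV cache and activations
    """
    return num_params_b * bytes_per_param * overhead

# Compare precisions for a 70B model.
for precision, bpp in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    print(f"{precision}: ~{estimate_vram_gb(70, bpp):.0f} GB")
```

Even at 4-bit quantization the estimate lands around 40 GB, which explains why a single consumer GPU (24 GB on an RTX 4090) is not enough on its own and multi-GPU configurations are recommended.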
