Any way to borrow compute from Apple M1

admin
Aug 24, 2024 03:55 PM

To borrow compute from a friend's Apple M1 Max for tasks like running LLaMA 3 or Stable Diffusion, run Ollama on the laptop so it exposes a local HTTP API (port 11434 by default), then put both machines on the same Tailscale or ZeroTier network for secure remote access. You can then make API calls over the tailnet without any remote desktop session, using the M1's GPU while it is otherwise idle. Keep an eye on unified memory, since model weights share RAM with everything else on the machine.
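As a minimal sketch of the client side: once your friend starts the server so it listens on all interfaces (e.g. `OLLAMA_HOST=0.0.0.0 ollama serve`), you can POST prompts to Ollama's `/api/generate` endpoint from your own machine. The Tailscale IP below (`100.64.0.2`) is a placeholder; substitute the address shown in your own tailnet.

```python
import json
import urllib.request

# Hypothetical Tailscale IP of the friend's M1 Max -- yours will differ.
OLLAMA_HOST = "http://100.64.0.2:11434"

def build_generate_request(prompt, model="llama3", host=OLLAMA_HOST):
    """Build an HTTP request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(prompt, model="llama3"):
    """Send the prompt to the remote Ollama server and return its reply text."""
    req = build_generate_request(prompt, model)
    # Generation on a laptop GPU can take a while, so use a generous timeout.
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires the Ollama server to be reachable over the tailnet):
# print(generate("Why is the sky blue?"))
```

Because everything rides over the tailnet's encrypted tunnel, no ports need to be opened on the friend's home router.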
