submitted on 03 Jul 2023 by [email protected] to c/[email protected]

https://public.dhe.ibm.com/ibmdl/export/pub/software/server/ibm-ai/conda/#/

On the face of it, the ability to run models larger than GPU memory seems extremely valuable, so why did IBM give it up? Not everyone has an 80 GB GPU.

Was the performance too slow?
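For context, the technique in question keeps most of the model's weights in host (CPU) memory and swaps them onto the GPU layer by layer during the forward pass, trading transfer time for capacity. A minimal pure-NumPy sketch of that idea, with host-to-device copies simulated by `np.copy` (the names `device_budget` and `forward_with_offload` are illustrative, not part of any IBM API):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Model" weights live in host (CPU) memory: 8 linear layers of 256x256.
host_layers = [rng.standard_normal((256, 256)) / 16 for _ in range(8)]

def forward_with_offload(x, layers, device_budget=2):
    """Forward pass that keeps at most `device_budget` layers 'on device'.

    np.copy stands in for a host->device transfer; that repeated transfer
    is the cost paid in exchange for fitting a model larger than GPU memory.
    """
    device_cache = {}  # layer index -> 'device' copy of the weights
    for i, w in enumerate(layers):
        if i not in device_cache:
            if len(device_cache) >= device_budget:
                device_cache.pop(min(device_cache))  # evict the oldest layer
            device_cache[i] = np.copy(w)             # simulated transfer
        x = np.tanh(x @ device_cache[i])
    return x

x = rng.standard_normal((4, 256))
y = forward_with_offload(x, host_layers)
print(y.shape)  # (4, 256)
```

Whether this is worthwhile in practice hinges on how well the transfers overlap with compute; when each layer must be re-copied on every step, PCIe bandwidth rather than the GPU becomes the bottleneck, which may be exactly the slowdown asked about above.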
