submitted 27 Apr 2024 by [email protected] to c/[email protected]
[-] [email protected] 0 points 4 months ago

It's already here. I run AI models on my own GPU, trained on data from various sources, for both search/GPT-style chat and images. You can basically point and click your way there with GPT4All, which bundles a chat client and lets you pick from a list of popular models without needing to know the CLI or much of anything else. It gives you a ChatGPT-like experience offline, using your GPU if it has enough VRAM for the model you've chosen, or falling back to the CPU if it doesn't. I don't think it does images, but there are other projects that make that just as simple on your own hardware.
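If you'd rather script it than use the GUI, GPT4All also ships Python bindings. A minimal sketch, assuming `pip install gpt4all`; the model filename and device flag are just illustrative examples, swap in whichever model you've downloaded:

```python
# Minimal offline chat via the gpt4all Python bindings.
# The model filename below is only an example; it is fetched on first use
# and then everything runs locally.
from gpt4all import GPT4All

# device="gpu" requests GPU inference; pass "cpu" if VRAM is tight.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="gpu")

with model.chat_session():
    reply = model.generate("Summarize why local LLMs are useful.", max_tokens=200)
    print(reply)
```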

[-] [email protected] 1 points 4 months ago

The M-series Macs with unified memory and ML cores are insanely powerful and much more flexible, because your 32 GB of system memory doubles as GPU VRAM, etc.
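To see that in practice, here's a rough sketch (assuming PyTorch with the MPS backend installed) that puts a tensor on the Apple GPU, where it lives in the same unified memory pool the CPU uses:

```python
# Check that PyTorch can use Apple's Metal (MPS) backend on an M-series Mac.
# Tensors on the "mps" device sit in unified memory, so the GPU addresses the
# same pool as the CPU instead of a separate VRAM bank.
import torch

if torch.backends.mps.is_available():
    device = torch.device("mps")
    x = torch.randn(4096, 4096, device=device)  # allocated in unified memory
    y = x @ x                                    # matmul runs on the GPU cores
    print("MPS result shape:", y.shape)
else:
    print("MPS backend not available; falling back to CPU.")
```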
