this post was submitted on 27 Sep 2024
30 points (96.9% liked)
Hardware
32 GB of VRAM would suck. Then I'll not buy a 5090. A 5080 with only 16 GB of GDDR7 would also suck. Better to get a 3090 with 24 GB then.
The only thing keeping 4080 (and 5080) cards "reasonably" priced is the fact that they only have 16 GB, so they aren't that good for AI shit. You don't need more than 16 GB of VRAM for gaming. If those cards had more VRAM, the AI datacenters would pick them up, keeping their price even higher than it is.
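Rough back-of-the-envelope math on why 16 GB is a hard cutoff for AI workloads (illustrative sketch, not from the comment itself; the function name and model sizes are hypothetical examples):

```python
def model_vram_gib(n_params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate VRAM needed just to hold a model's weights, in GiB.

    bytes_per_param=2 assumes fp16/bf16 weights. Real workloads need
    extra headroom for activations, KV cache, and framework overhead.
    """
    return n_params_billion * 1e9 * bytes_per_param / 2**30

# A 7B-parameter model in fp16 is ~13 GiB of weights: it barely
# squeezes into a 16 GB card. A 13B model (~24 GiB) simply doesn't fit,
# but would fit on a 24 GB 3090.
print(f"7B model:  {model_vram_gib(7):.1f} GiB")
print(f"13B model: {model_vram_gib(13):.1f} GiB")
```

So a hypothetical 24 GB 5080 would suddenly be a cheap inference card, and datacenter demand would bid the price up, which is the dynamic the comment describes.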
For VR, you do already.