this post was submitted on 30 Apr 2024
141 points (91.2% liked)
Technology
you are viewing a single comment's thread
Why call out Intel? Pretty sure AMD and Nvidia are both putting dedicated AI hardware in all of their new and upcoming product lines. From what I understand, they are even generally doing it better than Intel. Hell, Qualcomm is advertising the AI performance of its new chips, and so is Apple. I don't think there is anyone in the chip world who isn't hopping on the AI train.
Because I was only aware of Intel (and Apple) doing it on computers, whereas most major flagship mobile devices have those accelerators now.
GPUs were excluded, since they're not as universal as processors are. A dedicated video card is still by and large considered an enthusiast part.
Fair enough. Was just asking because the choice of company surprised me. AMD is putting "AI Engines" in its new CPUs (a separate silicon design from its GPUs), and while Nvidia largely sells only GPUs, which are less universal, it has had dedicated AI hardware (tensor cores) in its offerings for the past three generations. If anything, Intel is barely keeping up with its competition in this area. (For the record, I see vanishingly little value in the focus on AI as a consumer, so this isn't really a ding on Intel in my books, more an observation from a market-forces perspective.)
You're not wrong that GPU and AI silicon design are tightly coupled, but my point was that both of the GPU manufacturers are dedicating hardware to AI/ML in their consumer products. Nvidia has the tensor cores in its GPUs, which it justifies to consumers with DLSS and RT but which were clearly designed for AI/ML use cases when it presented them with Turing. AMD has the XDNA AI Engine that it is putting in its APUs, separate from its RDNA GPUs.
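One practical upshot of every vendor shipping its own accelerator (tensor cores, XDNA, Neural Engine, Hexagon, etc.) is that inference software ends up picking a backend from whatever is present on the machine. Here's a toy sketch of that selection logic — the provider names are purely illustrative (loosely modeled on how runtimes like ONNX Runtime expose "execution providers"), not a real API:

```python
# Toy sketch: how an inference runtime might choose among AI backends.
# Provider names below are hypothetical, for illustration only.

PREFERENCE = [
    "NpuProvider",         # dedicated AI silicon (e.g. AMD XDNA, Apple Neural Engine)
    "TensorCoreProvider",  # GPU matrix units (e.g. Nvidia tensor cores)
    "GpuProvider",         # generic GPU compute
    "CpuProvider",         # always-available fallback
]

def pick_backend(available):
    """Return the most preferred backend present on this machine."""
    for name in PREFERENCE:
        if name in available:
            return name
    raise RuntimeError("no usable backend found")

# A laptop with only a CPU and a dedicated NPU would land on the NPU:
print(pick_backend({"CpuProvider", "NpuProvider"}))  # NpuProvider
```

The point of the fallback chain is exactly the market dynamic above: software targets an abstract "AI backend," so each chipmaker has an incentive to ship silicon that slots in near the top of that list.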