this post was submitted on 17 Jul 2024
401 points (99.5% liked)

PC Gaming

For PC gaming news and discussion. PCGamingWiki

[–] [email protected] 112 points 3 months ago (3 children)

I’d pay extra for no AI in any of my shit.

[–] [email protected] 60 points 3 months ago (13 children)

I would already like to buy a 4K TV that isn't smart and have yet to find one. Please don't add AI into the mix as well :(

[–] [email protected] 19 points 3 months ago (2 children)

Look into commercial displays

[–] [email protected] 17 points 3 months ago (3 children)

The simple trick to turn a "smart" TV into a regular one is to cut off its internet access.

[–] [email protected] 16 points 3 months ago (1 children)

Except it will still run like shit, and may send telemetry via other means to your neighbor's same-brand TV.

[–] [email protected] 6 points 3 months ago (2 children)

I've never heard of that. Do you have a source on that? And how would it run like shit if you're using something like a Chromecast?

[–] [email protected] 10 points 3 months ago

Mine still takes several seconds to boot Android TV just so it can display the HDMI input, even if it's not connected to the internet. It has to be plugged into power at all times, because if there is a power cut it needs to boot Android TV all over again.

My old dumb TV did that in a second without booting an entire OS. Next time I need a big screen, it will be a computer monitor.

[–] [email protected] 9 points 3 months ago (1 children)

I was just thinking the other day how I'd love to "root" my TV like I used to root my phones. Maybe install some free OS instead

[–] [email protected] 36 points 3 months ago (17 children)

I would pay for AI-enhanced hardware...but I haven't yet seen anything that AI is enhancing, just an emerging product being tacked on to everything they can for an added premium.

[–] [email protected] 15 points 3 months ago

In the 2010s, it was cramming a phone app and Wi-Fi into things to try to justify the higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
In the 2020s, it's those same useless features, now with a bit of software with a flashy name that takes even more control away from the user and lets the manufacturer spy on them even further.

[–] [email protected] 9 points 3 months ago

It's like RGB all over again.

At least RGB didn't make a giant stock market bubble...

[–] [email protected] 7 points 3 months ago

Anything AI actually enhanced would be advertising the enhancement, not the AI part.

[–] [email protected] 6 points 3 months ago

My Samsung A71 has had devil AI since day one. You know that feature where you can mostly use fingerprint unlock, but once a day or so it asks for the actual passcode for added security? My A71's AI has a 100% success rate of picking the most inconvenient time to ask for the passcode instead of letting me do my thing.

[–] [email protected] 33 points 3 months ago (1 children)

I am generally unwilling to pay extra for features I don't need and didn't ask for.

[–] [email protected] 29 points 3 months ago (4 children)

We're not gonna make it, are we? People, I mean.

[–] [email protected] 29 points 3 months ago (7 children)

The dedicated TPM chip is already being used for side-channel attacks. A new processor running arbitrary code would be a black hat's wet dream.

[–] [email protected] 22 points 3 months ago (1 children)

It will be.

IoT devices are already getting owned at staggering rates. Adding a learning model that currently cannot be secured is absolutely going to happen, and it's going to cause a whole new batch of breaches.

[–] [email protected] 22 points 3 months ago

The “s” in IoT stands for “security”

[–] [email protected] 20 points 3 months ago (2 children)

Only 7% say they would pay more, which to my mind is the percentage of respondents who have no idea what "AI" in its current bullshit context even is

[–] [email protected] 8 points 3 months ago (2 children)

Or they know a guy named Al and got confused. ;)

[–] [email protected] 7 points 3 months ago (1 children)

Maybe I'm in the minority here, but I'd gladly pay more for Weird Al enhanced hardware.

[–] [email protected] 19 points 3 months ago (3 children)

What does AI enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI enhanced DLSS, and I’d do it again.

[–] [email protected] 28 points 3 months ago

When they start calling everything AI, soon enough it loses all meaning. They're gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that's for sure.

[–] [email protected] 17 points 3 months ago (2 children)

Who in the heck are the 16%

[–] [email protected] 13 points 3 months ago (2 children)

  • The ones who have investments in AI

  • The ones who listen to the marketing

  • The ones who are big Weird Al fans

  • The ones who didn't understand the question

[–] [email protected] 6 points 3 months ago

I would pay for Weird-Al enhanced PC hardware.

[–] [email protected] 14 points 3 months ago (1 children)

My old ass GTX 1060 runs some of the open source language models. I imagine the more recent cards would handle them easily.

What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?
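
For reference, "runs some of the open source language models" can be as simple as the sketch below. It assumes llama-cpp-python and a small quantized GGUF model; the filename, layer split, and prompt are placeholders rather than recommendations, and a 6 GB card like a 1060 only fits a partial offload.

```python
# Rough sketch: a small quantized model on an older 6 GB card via llama-cpp-python.
# Model filename and n_gpu_layers are illustrative placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # any ~4 GB GGUF quant
    n_gpu_layers=20,  # offload what fits in 6 GB of VRAM; the rest runs on CPU
    n_ctx=2048,       # modest context window to keep memory use down
)

out = llm("Q: Why won't most gamers pay extra for AI hardware?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```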

[–] [email protected] 14 points 3 months ago (1 children)

I was recently looking for a new laptop and I actively avoided laptops with AI features.

[–] [email protected] 10 points 3 months ago

Look, me too, but the average punter on the street just looks at the new AI features and goes "OK, sure, give it to me." Tell them about the dodgy shit that goes with AI and you'll probably get a shrug at most.

[–] [email protected] 13 points 3 months ago* (last edited 3 months ago) (4 children)

And when traditional AI programs can be run on much lower end hardware with the same speed and quality, those chips will have no use. (Spoiler alert, it's happening right now.)

Corporations, for some reason, can't fathom why people wouldn't want to pay hundreds of dollars more just for a chip that can run AI models they won't need most of the time.

If I want to use an AI model, I will, but if you keep developing shitty features that nobody wants using it, just because "AI = new & innovative," then I have no incentive to use it. LLMs are useful to me sometimes, but an LLM that tries to summarize the activity on my computer isn't that useful to me, so I'm not going to pay extra for a chip that I won't even use for that purpose.

[–] [email protected] 12 points 3 months ago (1 children)

Fuck, they won't upgrade to TPM for Windows 11

[–] [email protected] 11 points 3 months ago

I can't tell how good any of this stuff is, because none of the language they're using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, and how does it compare to the graphics cards people use for AI now?
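
For a rough sense of scale, the back-of-envelope math is just weight count times bytes per weight. This sketch (illustrative numbers only, ignoring activations and the KV cache) shows why a 7B model fits a mid-range GPU at 4-bit but not at fp16:

```python
# Back-of-envelope memory for model weights only (no activations / KV cache).
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (3, 7, 13):
    for bits in (16, 4):
        print(f"{params}B @ {bits}-bit ≈ {weight_memory_gb(params, bits):.1f} GB")
# A 7B model is ~14 GB at fp16 but ~3.5 GB at 4-bit. NPUs are marketed in TOPS,
# which says nothing about how large a model their memory can actually hold,
# hence the apples-to-oranges comparison with GPUs.
```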

[–] [email protected] 11 points 3 months ago

I'm willing to pay extra for software that isn't

[–] [email protected] 9 points 3 months ago

I don't think the poll question was well made... "Would you like to part with your money for..." vaguely shakes hand in air "...AI?"

People were already paying for "AI" even before ChatGPT came out and popularized things: DLSS.

[–] [email protected] 9 points 3 months ago (2 children)

Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to a near-baseline level of original performance? What then, eh?

[–] [email protected] 8 points 3 months ago (1 children)

Let me put it in lamens terms..... FUCK AI.... Thanks, have a great day

[–] [email protected] 8 points 3 months ago (1 children)

FYI the term is "layman's", as if you were using the language of a layman, or someone who is not specifically experienced in the topic.

[–] [email protected] 7 points 3 months ago (1 children)

Sounds like something a lameman would say

[–] [email protected] 8 points 3 months ago

Even DLSS only works great for some types of games.

Although there have been some clever uses of it, lots of games could gain a lot from proper efficiency of the game engine.

War Thunder runs like total crap on even the highest-end hardware, yet World of Warships has much more detailed ships and textures and runs fine off an HDD and graphics cards older than the GTX 7XX series.

Meanwhile on Linux, Compiz still runs crazy window effects and the 3D cube desktop much better and faster than KDE. It's so good I even recommend it for old devices with any kind of GPU, because the hardware acceleration will make your desktop fast and responsive compared to even the lightest window managers like Openbox.

TF2 went from 32-bit to 64-bit and had immediate performance gains upwards of 50%, almost entirely removing stuttering issues from the game.

Batman: Arkham Knight ran on a heavily modified version of Unreal Engine 3, which was insane for the time.

Most modern games and applications really don't need the latest and greatest hardware; they just need to be efficiently programmed, which is sometimes almost an art in itself. Slapping on "AI" to reduce the work is sort of a lazy solution that will have side effects, because you're effectively predicting the output.

[–] [email protected] 7 points 3 months ago* (last edited 3 months ago) (5 children)

I would pay extra to be able to run open LLMs locally on Linux. I wouldn't pay for Microsoft's Copilot stuff that's shoehorned into every interface imaginable while also causing privacy and security issues. The context matters.

[–] [email protected] 7 points 3 months ago

When a decent GPU alone is ~$1k and then someone wants you to pay even more for a feature that offers no tangible benefit, why the hell would anyone want it? I haven't bought a PC in over 25 years; I build my own, and for family and friends. I'm building another next week for family, and AI isn't even on the radar. If anything, this one is going to be anti-AI: it's getting a Linux dual-boot while sticking with Win10, because there's no way I'm subjecting family to that Win11 adware.

[–] [email protected] 7 points 3 months ago (1 children)

The other 26% were bots answering.

[–] [email protected] 12 points 3 months ago
[–] [email protected] 7 points 3 months ago

A big letdown for me is that, except in some rare cases, those extra AI features are useless outside of AI. Some NPUs are straight-up DSPs and could easily run OpenCL code; others are either designed so they can't handle normal floating-point numbers, only formats designed for machine learning, or are CPU extensions that are just even bigger vector multipliers for select datatypes (AMX).
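
To illustrate the "could easily run OpenCL code" point: below is the kind of plain general-purpose kernel such a unit could run if the vendor actually shipped an OpenCL driver for it (most don't, which is the letdown). The sketch uses pyopencl and simply runs on whatever OpenCL device is available.

```python
# Elementwise multiply as a generic OpenCL kernel (pyopencl), the sort of
# non-ML work an OpenCL-capable DSP/NPU could do if it were exposed.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void mul(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] * b[i];
}
""").build()

prg.mul(queue, a.shape, None, a_buf, b_buf, out_buf)
res = np.empty_like(a)
cl.enqueue_copy(queue, res, out_buf)
assert np.allclose(res, a * b)
```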

[–] [email protected] 6 points 3 months ago (1 children)

The biggest surprise here is that as many as 16% are willing to pay more...

[–] [email protected] 5 points 3 months ago* (last edited 3 months ago) (1 children)

I'm fine with NPUs / TPUs (AI-enhancing hardware) being included with systems, because they're useful for more than just OS shenanigans and commercial generative AI. Do I want Microsoft CoPilot Recall running on that hardware? No.

However, I've bought TPUs for things like Frigate servers and various ML projects. For gamers, there are some really cool use cases out there for using local LLMs to generate NPC responses in RPGs. For "Smart Home" enthusiasts, things like Home Assistant will be rolling out support for local LLMs later this year to make voice commands more context-aware.

So do I want that hardware in there so I can use it MYSELF for other things? Yes, yes I do. You probably will eventually too.
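
As a concrete example of the NPC-dialogue idea, here's a minimal sketch against a locally served model. It assumes an Ollama instance on its default port; the model name, persona, and prompt format are made-up placeholders, not anything a shipping game actually uses.

```python
# NPC dialogue from a local LLM served by Ollama (default endpoint shown).
import requests

def npc_reply(persona: str, player_line: str) -> str:
    prompt = (f"You are {persona}. Stay in character and answer in one or "
              f"two sentences.\nPlayer: {player_line}\nNPC:")
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3.2", "prompt": prompt, "stream": False},
        timeout=60,
    )
    r.raise_for_status()
    return r.json()["response"].strip()

print(npc_reply("a grumpy blacksmith in a rainy port town",
                "Can you repair my sword before nightfall?"))
```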
