this post was submitted on 18 Jul 2024
401 points (99.3% liked)

Technology

59598 readers
3536 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago

Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would be willing to pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn't know, while just under 2,000 voters said yes.

top 50 comments
[–] [email protected] 71 points 4 months ago (6 children)

someone tried to sell me a fucking AI fridge the other day. Why the fuck would I want my fridge to "learn my habits?" I don't even like my phone "learning my habits!"

[–] [email protected] 17 points 4 months ago (3 children)

Why does a fridge need to know your habits?

It has to keep the food cold all the time. The light has to come on when you open the door.

What could it possibly be learning?

[–] [email protected] 19 points 4 months ago (1 children)

Hi Zron, you seem to really enjoy eating shredded cheese at 2:00am! For your convenience, we’ve placed an order for 50lbs of shredded cheese based on your rate of consumption. Thanks!

[–] [email protected] 7 points 4 months ago

We also took the liberty of canceling your health insurance to help protect the shareholders from your abhorrent health expenses in the far future

[–] [email protected] 4 points 4 months ago

So I can see what you like to eat, then it can tell your grocery store, then your grocery store can raise the prices on those items. That's the point. It's the same thing with those memberships and coupon apps. That's the end goal.

[–] [email protected] 9 points 4 months ago

And it would improve your life zero. That is what is absurd about LLMs in their current iteration: they provide almost no benefit to the vast majority of people.

All a learning model would do for a fridge is send you advertisements for whatever garbage food is on sale. Could it make recipes based on what you have? Tell it you want to slowly get healthier and have it assist with grocery selection?

Nah, fuck you and buy stuff.

[–] [email protected] 70 points 4 months ago (1 children)

...just under 2,000 voters said "yes."

And those people probably work in some area related to LLMs.

It's practically a meme at this point:

Nobody:

Chip makers: People want us to add AI to our chips!

[–] [email protected] 18 points 4 months ago (1 children)

The even crazier part to me is that some chip makers we were working with pulled out of guaranteed projects with reasonably decent revenue to chase AI instead.

We had to redesign our boards, and they paid us the penalties in our contract for not delivering, so they could put more of their fab time towards AI.

[–] [email protected] 41 points 4 months ago* (last edited 4 months ago) (3 children)

This is one of those weird things that venture capital does sometimes.

VC is injecting cash into tech right now at obscene levels because they think that AI is going to be hugely profitable in the near future.

The tech industry is happily taking that money and using it to develop what they can, but it turns out the majority of the public don't really want the tool if it means they have to pay extra for it. Especially in its current state, where the information it spits out is far from reliable.

[–] [email protected] 21 points 4 months ago (1 children)

I don't want it outside of heavily sandboxed, limited-scope applications. I don't get why people want an agent of chaos fucking with all the files and systems they've cobbled together.

[–] [email protected] 6 points 4 months ago

NDAs also legally prevent you from using this forced garbage. Companies are going to get screwed over by other companies; capitalism is gonna implode, hopefully.

[–] [email protected] 12 points 4 months ago

I have to endure a meeting at my company next week to come up with ideas on how we can wedge AI into our products because the dumbass venture capitalist firm that owns our company wants it. I have been opting not to turn on video because I don’t think I can control the cringe responses on my face.

[–] [email protected] 8 points 4 months ago (2 children)

Back in the 90s in college I took a Technology course, which discussed how technology has historically developed, why some things are adopted and other seemingly good ideas don't make it.

One of the things that is required for a technology to succeed is public acceptance. That is why AI is doomed.

[–] [email protected] 39 points 4 months ago (4 children)

There's really no point unless you work in specific fields that benefit from AI.

Meanwhile every large corpo tries to shove AI into every possible place they can. They'd introduce ChatGPT to your toilet seat if they could.

[–] [email protected] 19 points 4 months ago (2 children)

"Shits are frequently classified into three basic types..." and then gives 5 paragraphs of bland guff

[–] [email protected] 19 points 4 months ago

With how much scraping of reddit they do, there's no way it doesn't try ordering a poop knife off of Amazon for you.

[–] [email protected] 11 points 4 months ago (5 children)

Imagining a ChatGPT toilet seat made me feel uncomfortable.

[–] [email protected] 4 points 4 months ago (1 children)

Someone did a demo recently of AI acceleration for 3D upscaling (think DLSS or AMD's equivalent) and it showed a nice boost in performance. It could be useful in the future.

I think it's kind of like ray tracing. We don't have a real use for it now, but eventually someone will figure out something it's actually good for and use it.

[–] [email protected] 3 points 4 months ago* (last edited 4 months ago) (4 children)

AI acceleration for 3d upscaling

Isn't that not only similar to, but exactly what DLSS already is? A neural network that upscales games?

[–] [email protected] 30 points 4 months ago (2 children)

And what do the companies take away from this? "Cool, we just won't leave you any other options."

[–] [email protected] 28 points 4 months ago

I don't mind the hardware. It can be useful.

What I do mind is the software running on my PC sending all my personal information, screenshots, and keystrokes to a corporation that will use all of it for profit to build a user profile and send targeted advertisements, and that could potentially be used against me.

[–] [email protected] 12 points 4 months ago (1 children)

No, but I would pay good money for a freely programmable FPGA coprocessor.

If the AI chip is implemented as one and is useful for other things, I'm sold.

[–] [email protected] 3 points 4 months ago (2 children)

I think manufacturers need to get a lot more creative about simplified computing. The RPi Pico's GPIO engine is powerful yet simple, and a good example of what is possible with some good application analysis and forethought.

[–] [email protected] 10 points 4 months ago

84% said no.

16% punched the person asking them for suggesting such a practice. So they also said no. With their fist.

[–] [email protected] 10 points 4 months ago

Any “AI” hardware you buy today will be obsolete so fast it will make your dick bleed.

[–] [email protected] 10 points 4 months ago (1 children)

It's bad enough they shove it at you on some websites. Really not interested in being their lab rat.

[–] [email protected] 9 points 4 months ago (2 children)

I honestly have no idea what AI does to a processor, and would therefore not pay extra for the badge.

If it provided a significant speed improvement or something, then yeah, sure. Nobody has really communicated to me what the benefit is. It all seems like hand waving.

[–] [email protected] 10 points 4 months ago

What they mean is that they are putting in dedicated processors or other hardware just to run an LLM. It doesn't speed up anything other than the faux-AI tool they are implementing.

LLMs require a ton of math that is better suited to video processors than the general-purpose CPU in most machines.
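
As a rough illustration of that point (not from the article or this thread): LLM inference is dominated by large matrix multiplications, which is exactly the workload GPUs and "AI" accelerators are built for. The sketch below, assuming PyTorch is installed and the matrix size is arbitrary, times one matmul on the CPU and, if a CUDA device is present, on the GPU.

```python
# Hypothetical sketch: compare a big matrix multiply on CPU vs. GPU.
# Requires PyTorch; the 4096x4096 size is an arbitrary example.
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

start = time.perf_counter()
_ = a @ b                       # plain CPU matmul
cpu_time = time.perf_counter() - start
print(f"CPU matmul: {cpu_time:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    _ = a_gpu @ b_gpu           # warm-up so timing excludes setup costs
    torch.cuda.synchronize()
    start = time.perf_counter()
    _ = a_gpu @ b_gpu
    torch.cuda.synchronize()    # wait for the GPU to actually finish
    gpu_time = time.perf_counter() - start
    print(f"GPU matmul: {gpu_time:.3f} s")
```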

[–] [email protected] 9 points 4 months ago (2 children)

They want you to buy the hardware and pay for the additional energy costs so they can deliver Clippy 2.0, the watching-you-wank edition.

[–] [email protected] 9 points 4 months ago* (last edited 4 months ago) (1 children)

That's kind of abstract. Like, nobody pays purely for hardware. They pay for the ability to run software.

The real question is, would you pay $N to run software package X?

Like, go back to 2000. If I say "would you pay $N for a parallel matrix math processing card", most people are going to say "no". If I say "would you pay $N to play Quake 2 at resolution X and fps Y and with nice smooth textures," then it's another story.

I paid $1k for a fast GPU so that I could run Stable Diffusion quickly. If you asked me "would you pay $1k for an AI-processing card" and I had no idea what software would use it, I'd probably say "no" too.

[–] [email protected] 5 points 4 months ago (1 children)

Yup, the answer is going to change real fast when the next Oblivion with NPCs you can talk to needs this kind of hardware to run.

[–] [email protected] 3 points 4 months ago* (last edited 4 months ago)

I'm still not sold that dynamic text generation is going to be the major near-term application for LLMs, much less in games. Like, don't get me wrong, it's impressive what they've done. But I've also found it to be the least-practically-useful of the LLM model categories. Like, you can make real, honest-to-God solid usable graphics with Stable Diffusion. You can do pretty impressive speech generation in TortoiseTTS. I imagine that someone will make a locally-runnable music LLM model and software at some point if they haven't yet; I'm pretty impressed with what the online services do there. I think that there are a lot of neat applications for image recognition; the other day I wanted to identify a tree and seedpod. Someone hasn't built software to do that yet (that I'm aware of), but I'm sure that they will; the ability to map images back to text is pretty impressive. I'm also amazed by the AI image upscaling that Stable Diffusion can do, and I suspect that there's still room for a lot of improvement there, as that's not the main goal of Stable Diffusion. And once someone has done a good job of building a bunch of annotated 3d models, I think that there's a whole new world of 3d.

I will bet that before we see that becoming the norm in games, we'll see LLMs regularly used for either pre-generated speech synth or in-game speech synthesis, so that characters can say text that might be procedurally generated rather than static pre-recorded samples, but isn't necessarily generated by an LLM. Like, it's not practical to have a human voice actor cover all possible phrases with static recorded speech that one might want an in-game character to speak.

[–] [email protected] 8 points 4 months ago* (last edited 4 months ago)

I agree that we shouldn't jump immediately to AI-enhancing it all. However, this survey is riddled with problems, from selection bias to external validity. Heck, even internal validity is a problem here! How does the survey account for social desirability bias, sunk cost fallacy, and anchoring bias? I'm so sorry if this sounds brutal or unfair, but I just hope to see fewer validity threats. I think I'd be less frustrated if the title could be something like "TechPowerUp survey shows 84% of 22,000 respondents don't want AI-enhanced hardware".

[–] [email protected] 8 points 4 months ago (3 children)

Show me a practical use for AI and I'll show you the money. Genmoji ain't it.

Give me a virtual assistant that actually functions and I will give you A LOT of money...

[–] [email protected] 6 points 4 months ago (1 children)

30% of people will believe literally anything. 16% means even half of the deranged people aren't interested.

[–] [email protected] 6 points 4 months ago (1 children)

Most people won't pay for it because a lot of AI stuff is done cloud-side. Even stuff that could be done locally is done in the cloud a lot. If that weren't possible, more people would probably want the hardware. It makes more sense for corporations to invest in hardware.

[–] [email protected] 6 points 4 months ago (1 children)

AI in Movies: "The only logical solution is the complete control/eradication of humanity."

AI in Real Life: "Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices." Dave: "THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!"

[–] [email protected] 5 points 4 months ago

Personally I would choose a processor with AI capabilities over a processor without, but I would not pay more for it

[–] [email protected] 5 points 4 months ago (2 children)

Most people already have pretty decent AI hardware in the form of a GPU.

Sure, dedicated hardware might be more efficient for mobile devices, but that's already done better in the cloud.

[–] [email protected] 3 points 4 months ago

It's not really done better in the cloud if you can push the compute out to the device. When you can leverage edge hardware you save bandwidth fees and a ton of cloud costs. It's faster in the cloud because you can leverage a cluster with economies of scale, but any AI company would prefer the end-user to pay for that compute instead, if they can service requests adequately.
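
A back-of-the-envelope sketch of that trade-off, with completely made-up placeholder numbers (none of these figures come from the thread), might look like this in Python:

```python
# All numbers below are invented placeholders for illustration only.
users = 1_000_000                    # hypothetical active users
requests_per_user_per_day = 50       # hypothetical AI requests per user
cloud_cost_per_request = 0.002       # hypothetical dollars of GPU time + bandwidth
days = 365

cloud_yearly = users * requests_per_user_per_day * days * cloud_cost_per_request
print(f"Hypothetical yearly cloud inference bill: ${cloud_yearly:,.0f}")

# Running the same requests on the user's own "AI" hardware shifts that
# marginal cost (compute and electricity) onto the user, which is why
# vendors would rather the edge device pay for it.
```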
