this post was submitted on 30 Dec 2024
1547 points (97.1% liked)

memes

10944 readers
3858 users here now

Community rules

1. Be civil: No trolling, bigotry, or other insulting/annoying behaviour

2. No politics: This is a non-politics community. For political memes, please go to [email protected]

3. No recent reposts: Check for reposts when posting a meme; you can only repost after 1 month

4. No bots: No bots without the express approval of the mods or the admins

5. No Spam/Ads: No advertisements or spam. This is an instance rule and the only way to live.

founded 2 years ago
(page 2) 50 comments
[–] [email protected] 11 points 2 weeks ago (1 children)

But the companies must posture that they're on the cutting edge! Even if they only put the letters "AI" on the box of a rice cooker without changing the rice cooker.

[–] [email protected] 6 points 2 weeks ago

When it comes to the marketing teams in such companies, I wonder what the ratio is between true believers and "this is stupid, but if it spikes the numbers next quarter, that will benefit me."

[–] [email protected] 10 points 2 weeks ago (4 children)

According to some meme I saw, it's gonna fuck your wife in 2025.

[–] [email protected] 10 points 2 weeks ago

"AI" isn't ready for any type of general consumer market and that's painfully obvious to anyone even remotely aware of how it's developing, including investors.

...but the cost-benefit analysis on being first to market with anything even remotely close to the universal applicability of AI is so insanely lopsided toward "benefit" that it's worth essentially any conceivable risk, because the payoff if you get it right is essentially infinite.

It won't ever stop

[–] [email protected] 10 points 2 weeks ago

One of the leading sources of enshittification.

[–] [email protected] 9 points 2 weeks ago (7 children)

I hate what AI has become and is being used for. I strongly believe it could have been used far more ethically. A solid example is Perplexity: it shows you the sources being used at the top, the first thing you see when it gives a response. Everything else is the opposite of this. Even Gemini, despite it being rather useful in day-to-day life when I need a quick answer and am not in a position to hold my phone, like when driving, doing dishes, or doing yard work with my earbuds in.

[–] [email protected] 8 points 2 weeks ago (5 children)

Forcing AI into everything maximizes efficiency, automates repetitive tasks, and unlocks insights from vast data sets that humans can't process as effectively. It enhances personalization in services, driving innovation and improving user experiences across industries. However, thoughtful integration is critical to avoid ethical pitfalls, maintain human oversight, and ensure meaningful, responsible use of AI.

[–] [email protected] 7 points 2 weeks ago (1 children)

It's not really targeting the consumers here; it's just to impress investors.

[–] [email protected] 7 points 2 weeks ago
[–] [email protected] 6 points 2 weeks ago

Please tell this to my simulation group members. I told them, but they won't listen.

[–] [email protected] 6 points 2 weeks ago
[–] [email protected] 6 points 2 weeks ago* (last edited 2 weeks ago)

Plot twist: this image was generated by AI /j

[–] [email protected] 6 points 2 weeks ago

The Imperium of Man got this right.

[–] [email protected] 6 points 2 weeks ago (1 children)

yes, we need to ask for AI's consent first!

[–] [email protected] 5 points 2 weeks ago

oh wait, i think i read it wrong

[–] [email protected] 5 points 2 weeks ago

I don't blame them, they have to compensate their organic intelligence somehow.

[–] [email protected] 5 points 2 weeks ago (2 children)

I like this typography, but I don't like somebody pretending they represent everybody.
