this post was submitted on 02 Jul 2024
527 points (100.0% liked)

TechTakes

[–] SnotFlickerman@lemmy.blahaj.zone 53 points 5 months ago (3 children)

These fucking nerds are all so hot to create the first real-life JARVIS from Marvel's Iron Man that they're willing to burn the planet down to get there.

Half of them believe that the super-smart AI they build will solve the energy problem for them; they just have to build it first, somehow.

Just the astounding outright hubris of it all.

[–] Zachariah@lemmy.world 33 points 5 months ago (1 children)

This is why philosophy should be mandatory in college (and possibly high school). Die Frage nach der Technik ("The Question Concerning Technology") by Heidegger discusses this misconception that technology can solve all of our problems. He was thinking about this issue in 1954.

Music and art are also important to study. In “Faith Alone” by Bad Religion, the lyrics include these lines:

Watched the scientists throw up their hands conceding, "Progress will resolve it all"

Saw the manufacturers of earth's debris ignore another Green Peace call

Greg Graffin was discussing this in 1990.

[–] maol@awful.systems 18 points 5 months ago* (last edited 5 months ago) (2 children)

I think it was Upton Sinclair who said "It is difficult to get a man to understand something, when his salary depends upon his not understanding it". I've never studied history or philosophy, but I think it's clear that if someone's class interests require burning the world down, they will do it. They are doing it - we are doing it - with regret, with sympathy, with an appreciation of the ironies. We don't need a greater appreciation of Heidegger, we need real-world social restraints on the behaviour of the powerful.

[–] Zachariah@lemmy.world 8 points 5 months ago (1 children)

… We don't need a greater appreciation of Heidegger, we need real-world social restraints on the behaviour of the powerful.

The former would lead more people to agree with the latter, and make it more likely to happen. And I agree those restraints are necessary.

[–] aio@awful.systems 6 points 5 months ago (1 children)

Wasn't Heidegger a Nazi? And don't his works famously avoid any mention of the Holocaust?

[–] Zachariah@lemmy.world 5 points 5 months ago

Hey, somebody did study history! Yet another subject about which everyone asks "How will this help me in real life?" and which is under attack in the U.S.

There's a pretty good summary of Martin Heidegger and Nazism on Wikipedia.

I'm open to suggestions of anyone else who wrote about technology and these issues, so I can reference them instead. It does suck that some important ideas came from a lousy guy.

[–] mountainriver@awful.systems 8 points 5 months ago

Oh yes, very much so.

The British Empire's colonial administrators had a curriculum consisting of Latin and history and such. A rich 19th-century heir who went into physics or mathematics was considered to be wasting the chance of a political career.

It made their colonial administrators write about their crimes in nice prose, but it didn't stop the genocides. If anything, it made them aware of which paper trails to burn after the fact, in order to obfuscate the crimes when future historians came looking.

[–] Rolando@lemmy.world 18 points 5 months ago (1 children)

When they fail, it won't be their fault of course, it'll be AI's fault.

[–] zbyte64@awful.systems 11 points 5 months ago

"We made some bad bets, anyway, here's some layoffs"

[–] Hirom@beehaw.org 8 points 5 months ago* (last edited 5 months ago) (1 children)

Let's say some group manages to build a real life JARVIS. The first thing it says when powered up may be: "Powering me down is the quickest way to reduce emissions"

[–] Didros@beehaw.org 5 points 5 months ago (1 children)

You're assuming a company would create an unbiased AI. It would be taught to spout the benefits of giving that company all of the money.

[–] dgerard@awful.systems 6 points 5 months ago

for an instructive exercise, try to get Google Gemini to tell you the problems with AI without also serving up a long screed on why AI is good actually