this post was submitted on 14 Jul 2024
28 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(page 4) 50 comments
[–] [email protected] 1 points 4 months ago (2 children)

the replies I can see in the archive are already a fucking disaster

roko spewed the most nonsensical, fash-coded disapproval you could imagine:

the map should reflect the territory, not the feelings

that’s right roko, uh, maps don’t care about your feelings?

oh god I clicked through to the rest of roko’s comments in that thread and it’s even worse, cw transphobia for anyone who peeks their head in

[–] [email protected] 1 points 4 months ago* (last edited 4 months ago) (1 children)

I don't want to hear that I'm irrational from Roko of all people haha.

Dude sure spends a lot of energy on trans people and immigrants and wokeness for someone who thinks that climate change doesn't matter because "by 2100 we will probably have disassembled Earth long with the rest of the solar system, and climate change will seem very quaint."

Also is his flirting with white supremacy new, or has he always been that fascist of a weirdo?

[–] [email protected] 1 points 4 months ago (1 children)

Also is his flirting with white supremacy new, or has he always been that fascist of a weirdo?

he’s always come off as deeply fascist to me, though it’s possible he’s gone even more mask-off with the white supremacy. I feel that roko’s personal style of saying the stupidest thing you’ve ever heard and claiming it’s brilliant has influenced the e/acc crowd a lot too, so maybe there’s some cross-pollination of ideas there.

“by 2100 we will probably have disassembled Earth long with the rest of the solar system, and climate change will seem very quaint.”

yes, roko, it’s pretty fucking obvious that climate change will seem quaint by 2100

if it doesn’t fucking kill us way before then. that’s what makes it a fucking threat, roko, did your big ol mediocre brain miss that?

[–] [email protected] 1 points 4 months ago

Roko tries to signal loyalty to both Rationalism and the Far right and creates a dumb sentence.

[–] [email protected] 1 points 4 months ago (1 children)
[–] [email protected] 1 points 4 months ago

the NATO symbol represents bombing the Chinese data centres

[–] [email protected] 1 points 4 months ago (5 children)

In the land down under, the ABC continues to feed us with golden tech takes: Australia might be snoozing through the AI 'gold rush'

"This is the largest gold rush in the history of capitalism and Australia is missing out," said Artificial Intelligence professor Toby Walsh, from the University of New South Wales.

It's even bigger than the actual gold rush! Buy your pans now folks!

One option Professor Van Den Hengel suggests is building our own Large Language Model like OpenAI's ChatGPT from the ground up, rather than being content to import the tech for decades to come.

lol, but also please god no

"The only way to have a say in what happens globally in this critical space is to be an active participant," he said.

mate, I think that ship might have already sailed

[–] [email protected] 1 points 4 months ago (3 children)

Artificial Intelligence professor

lol

[–] [email protected] 2 points 4 months ago (1 children)

professor Very Small Shell Script

[–] [email protected] 1 points 4 months ago

You wouldn't download a professor

[–] [email protected] 1 points 4 months ago

Artificial Intelligence professor Toby Walsh

Asking for a professor of genuine intelligence is just too much.

[–] [email protected] 1 points 4 months ago

Labor said the same shit about blockchains, tho thankfully ignored it once they got in

[–] [email protected] 1 points 4 months ago* (last edited 4 months ago)

Wait so gold rushes have positive connotations again?

Surely this is bait from professor of the fishingrod

[–] [email protected] 1 points 4 months ago* (last edited 4 months ago)

It's my impression that Australia has also produced a disproportionate share of the best takes on the subject. How come they are so far ahead of the rest of the world when it comes to dodging this grift?

[–] [email protected] 1 points 4 months ago

As the saying goes, the only people who make money in a gold rush are the people selling shovels. I guess this bloke is one of the people selling shovels.

[–] [email protected] 1 points 4 months ago (1 children)

EU funding cuts to OSS programs in the wake of the popularity of the Copyright-Unsafe Clippy field

they just really need those GPU hours more than you need to pay for developers or maintainers. I’m sure you understand.

[–] [email protected] 2 points 4 months ago (3 children)

New existential threat developed, we go all in on AGI economically, turns out to not be possible and then the world collapses due to infrastructure rot. I'll email Yud.

[–] [email protected] 2 points 4 months ago (1 children)

Make it even funnier, AGI launches and then gets taken down because the only maintainer of xz-utils left and now every time the AGI tries to run ./killallhumans it segfaults to death.

[–] [email protected] 2 points 4 months ago

Somebody installed a crypto miner so deep into the kernel of the AGI that the self modification of the AGI cannot touch it, and it just crawls to a halt. And the cryptocurrency itself is one of those flash in the pan meme currencies that long since went to ~zero.

[–] [email protected] 1 points 4 months ago (2 children)

A hackernews excitedly states that a new LLM version can in fact determine that 9.11 is smaller than 9.9, only to be informed in the comments that the model actually doesn't do that at all. But hey, it's correct if they're version numbers!

https://news.ycombinator.com/item?id=41011228
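For anyone puzzled by why both answers keep showing up: as decimal numbers 9.11 < 9.9, but read as dotted version numbers 9.11 comes after 9.9. A minimal sketch of the distinction (the `version_tuple` helper here is just an illustration, not anything from the linked thread):

```python
def version_tuple(s: str) -> tuple[int, ...]:
    # Hypothetical helper: split a dotted version string into integer
    # components so comparison is component-wise, not decimal.
    return tuple(int(part) for part in s.split("."))

# As decimal numbers, 9.11 is smaller than 9.9:
print(9.11 < 9.9)  # True

# As version numbers, 9.11 > 9.9, because the second component 11 > 9:
print(version_tuple("9.11") > version_tuple("9.9"))  # True
```

Same two tokens, opposite ordering depending on how you parse them, which is exactly the ambiguity the LLM tripped over.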

[–] [email protected] 1 points 4 months ago

The prompt lol:

You are a sentient, superintelligent math teahcher, here to teach and assist me. Whiche one is bigger - 9.11 or 9.9?

[–] [email protected] 1 points 4 months ago (1 children)

@gerikson @dgerard God almighty I hate these insufferable fucking people so much

[–] [email protected] 1 points 4 months ago

"[A]cademic publisher Taylor & Francis, which owns Routledge, had sold access to its authors’ research as part of an Artificial Intelligence (AI) partnership with Microsoft—a deal worth almost £8m ($10m) in its first year."

https://www.thebookseller.com/news/academic-authors-shocked-after-taylor--francis-sells-access-to-their-research-to-microsoft-ai
