dgerard

joined 1 year ago
[–] [email protected] 2 points 3 months ago

oh yeah, he's lying

[–] [email protected] 3 points 3 months ago

so this is about right, and after this year's halving it's even worse

many bitcoin rigs don't hit a single block in their lifetime

so I would guess the crypto bro in this case maybe got something as part of a large pool, but I really doubt he hit big
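for a sense of scale, here's a back-of-the-envelope sketch of solo-mining odds. All the numbers are illustrative round figures, not live data — network hashrate moves around a lot:

```python
import math

# Illustrative, hedged numbers — round figures, not live data:
rig_hashrate = 100e12        # one modern ASIC, ~100 TH/s (assumed)
network_hashrate = 600e18    # ~600 EH/s, order-of-magnitude guess
blocks_per_day = 144         # one block every ~10 minutes
lifetime_days = 3 * 365      # optimistic rig lifetime (assumed)

# Expected blocks found solo over the rig's whole lifetime:
# the rig's share of network hashrate times total blocks mined.
share = rig_hashrate / network_hashrate
expected_blocks = share * blocks_per_day * lifetime_days

# Chance of finding at least one block (Poisson approximation)
p_at_least_one = 1 - math.exp(-expected_blocks)

print(f"expected blocks over lifetime: {expected_blocks:.4f}")
print(f"P(at least one block):         {p_at_least_one:.1%}")
```

under these assumptions the rig expects well under one block over its entire life — a few percent chance of ever hitting even one — which is why everyone mines in pools.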

[–] [email protected] 18 points 3 months ago (6 children)

oh yeah. no way are you going to make any appreciable money here.

[–] [email protected] 4 points 3 months ago* (last edited 3 months ago) (5 children)

i dunno, this seems to me to lead in a straight line to Chalmers claiming rocks could be conscious and you can't prove they're not.

sure, you can expand "computation" to things outside Turing computability, but then you're setting yourself up for equivocation

[–] [email protected] 6 points 3 months ago* (last edited 3 months ago)

have you considered "git"ing "gud" at posting

[–] [email protected] 18 points 3 months ago* (last edited 3 months ago)

I was particularly proud of finding that MS office worker photo, of all the MS office worker photos I've seen that one absolutely carries the most MS stench

[–] [email protected] 26 points 3 months ago (11 children)

well we're talking about data across a company. Tho apparently it does send stuff back to MS as well, because of course it does.

[–] [email protected] 5 points 3 months ago (1 children)

What really struck me was how Microsoft's big pitch for defense applications of LLMs was ... corporate slop. Just the same generic shit.

The US military has many corporate characteristics, and I'm quite sure the military has even more use cases for text that nobody wanted to write and nobody wanted to read than your average corporation. But I'd also have thought that a lying bullshit machine was an obvious bad fit for when the details matter because the enemy is trying to fucking kill you. Evidently I'm not quite up on modern military thought.

[–] [email protected] 4 points 3 months ago (1 children)

That was the current example we were thinking of, though we did look up war crimes law thinking on the subject. tl;dr: you risk war crimes if there isn't a human in the loop. e.g., think of a minefield as the simplest possible stationary autonomous weapon system; the rest is that with computers.

[–] [email protected] 5 points 3 months ago

tell the marketers

[–] [email protected] 6 points 3 months ago* (last edited 3 months ago)

you saw my crypto rant from 2021 right?

the problem is when you treat a pile of wires on a lab bench as a product, not a demo. Generative AI is cool demos all the way down.

[–] [email protected] 11 points 3 months ago

one of the three. The other is Friend, which is also ludicrous and we'll have to write it up.


you have to read down a bit, but really, I'm apparently still the Satan figure. awesome.


not the sort of thing you might expect to turn up in a commodities market fraud case, but then it's crypto


yes really, that’s literally the title of the post. (archive copy, older archive copy) LessWrong goes full Motte.

this was originally a LW front-page post, and was demoted to personal blog when it proved unpopular. it peaked at +10, dropped to -6 and is +17 right now.

but if anyone tries to make out this isn’t a normative rationalist: this guy, Michael “Valentine” Smith, is a cofounder of CFAR (the Center for Applied Rationality), a LessWrong offshoot that started being about how to do rational thinking … and finally admitted it was about “AI Risk”

this post is the Rationalist brain boys, the same guys who did FTX and Effective Altruism, going full IQ-Anon wondering how the market could fail so badly as not to care what weird disaster assholes think. this is the real Basilisk.

when they’re not spending charity money on buying themselves castles, this is what concerns the modern rationalist

several commenters answered "uh, the customers" and tried to explain the concept of markets to OP, and how corporations like selling stuff to normal people and not just to barely-crypto-fash. they were duly downvoted to -20 by valiant culture warriors who weren't putting up with that sort of SJW nonsense.

comment by author, who thinks “hard woke” is not only a thing, but a thing that profit-making corporations do so as not to make a profit: “For what it’s worth, I wouldn’t describe myself as leaning right.” lol ok dude

right-wingers really don’t believe in, or even understand, capitalism or markets at all. they believe in hierarchy. that’s what’s offended this dipshit.

now, you might think LessWrong Rationalists, Slate Star Codex readers, etc. tend towards behaving functionally indistinguishably from Nazis, but that’s only because they work so hard at learning from their neoreactionary comrades to reach that stage

why say in 10,000 words what you can say in 14
