V0ldek

[–] V0ldek@awful.systems 2 points 9 hours ago

Also I'm sorry but

Why the discrepancy? A footnote in the CE Delft report makes it clear: the price figures for macronutrients are largely based on a specific amino acid protein powder that sells for $400 a ton on the sprawling e-commerce marketplace Alibaba.com.

this is exactly the sort of magical thinking I'm talking about: "it will scale because we can order tons of the stuff off Alibaba". Just what the fuck are you smoking, mate? This can't be good-faith analysis.

[–] V0ldek@awful.systems 4 points 2 days ago

A lot of “I can control my emotions and choose how I act, you should try that” - yeah stop. We’re human. Emotions are normal.

Ye, that's the point? The point is not to suppress emotions but to recognise them as they're happening to you. It's not even that there's objective value assigned to the emotions; it's simply so that you yourself can perform introspection of the kind "I did that action because I was furious. Now, is that good or bad?". But it's still entirely okay to make a conscious decision of the form:

  1. I'm gonna punch that motherfucker
  2. Okay, stop, I am feeling fury right now, I shouldn't allow just the emotion to guide me. Let's think.
  3. Okay, I thought this through, I'm gonna punch that motherfucker with purpose.
[–] V0ldek@awful.systems 4 points 2 days ago* (last edited 2 days ago)

Step one is understanding you only control your own thoughts and actions. Step two is learning how to control your anger and use it as fuel for deliberate actions.

Honestly, I think Luigi here just followed this wisdom. Recognised that he was rightfully angry at the system and directed that anger at someone responsible. You only control your actions, and your action can be to shoot a motherfucker on the street ¯\_(ツ)_/¯

I'm not condoning or saying it's morally acceptable, but I don't think it's philosophically incoherent.

[–] V0ldek@awful.systems 7 points 2 days ago (2 children)

I think it forces us all to ask an important introspective question -- if I were to become the target of a national manhunt, would my posting history look cringe?

[–] V0ldek@awful.systems 4 points 2 days ago

Realistic version: pulling the lever would save five lives but that decision would cost shareholders $7.23. What should you do?

10/10 CEOs fail this test!

[–] V0ldek@awful.systems 4 points 2 days ago* (last edited 2 days ago) (1 children)

Very good read, but throughout I can't help but say to myself "ye so the issue is scale. AS ALWAYS"

This is a tale as old as time. Fusion energy is here! Quantum computers will revolutionise the world! Lab-grown meat! All based on actual scientific experiments and progress, but tiny, one-shot experiments under best-case conditions. There is no reason to think any of it brings us closer to a future where those are commonplace, except for a very nebulous technical meaning of "closer" as "yes, time has passed". There is no reason to think this would ever scale in any way! Like, there is a chance that e.g. fusion energy at any meaningful scale is just... impossible? Like, physically impossible to do. Or a stable quantum computer able to run Doom. Or lab-grown meat on a supermarket shelf. Every software engineer should understand this: we know there are ideas that work only when they're in a limited setting (number of threads, connections, size of input, whatever).
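
To be clear about what I mean by that last bit, here's a toy sketch of my own (not from the article): the same "find the duplicates" idea in two versions, one that's perfectly fine at demo scale and useless at real scale.

```python
import random
import time

def dedupe_quadratic(items):
    """Works great in the lab: O(n^2), fine for a few thousand records."""
    out = []
    for x in items:
        if x not in out:  # linear scan of the output list for every element
            out.append(x)
    return out

def dedupe_linear(items):
    """The same idea made to scale: O(n), because membership checks hit a set."""
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

if __name__ == "__main__":
    for n in (1_000, 50_000):
        data = [random.randrange(n) for _ in range(n)]
        t0 = time.perf_counter()
        dedupe_quadratic(data)
        t1 = time.perf_counter()
        dedupe_linear(data)
        t2 = time.perf_counter()
        # at n=1,000 both look instant; at n=50,000 the "same" idea already crawls
        print(f"n={n:>7,}: quadratic {t1 - t0:.2f}s, linear {t2 - t1:.2f}s")
```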

The media is always terrible at communicating this. Science isn't fucking magic; the fact that scientists were able to put one more qubit into their quantum computer means literally nothing to you, because the answer to "when will we have personal quantum computers" is "what? how did you get into my lab?". We have no idea. 50 years? 100 years? 1000 years? Likely never? Which number can I pull out of my ass for you to fuck off and let me do my research in peace? Of course, science is amazing, reading about those experiments is extremely interesting and cool as all fuck, but for some fucking reason the immediate reaction of the general public is "great, how quickly can we put a price tag on it".

And this leads to this zeitgeist where the next great "breakthrough" is just around the corner and is going to save us all. AI will fix the job market! Carbon capture will fix climate change! Terraforming Mars will solve everything! Sit the fuck down and grow up, this is not how anything works. I don't even know where this idea of "breakthroughs" comes from, the scientific process isn't an action movie with three acts and a climax, who told you that? What even was the last technological "breakthrough"? Transistors were invented like 70 years ago, but it wasn't an immediate breakthrough; it required like 40 years of work on improving vacuum tubes to get there. And that was based on a shitton of work on electrical theory from the 19th century. It was a slow process of incremental scientific discoveries across nations and people, which culminated in you having an iPhone 200 years later. And that's at least based on something we can actually easily observe in the natural world (and, funnily enough, we still don't have a comprehensive theory of how lightning storms even form on Earth). With fusion you're talking about replicating the heart of a star here on Earth, with lab-grown meat you're talking about growing flesh in defiance of gods, and you think it's an overnight thing where you'll wake up tomorrow and suddenly bam we just have cold fusion and hot artificial chicken?

I hate how everyone seems to be addicted to, I don't know, just speed as a concept? Things have to be now, news is only good if it reaches me as breaking news within 5 minutes, science is only good if it's just around the corner, a product is only good if it gets one billion users in a month. Just calm the fuck down. When was the last time you smelt the roses?

If you keep running through life, all the roses are gonna burn down before you realise.

[–] V0ldek@awful.systems 3 points 2 days ago

Salvation Army

they are certainly mostly doing worthwhile things

No. Nope. Not in the slightest. Crucially, they're not even a charity! They don't get any of the financial transparency scrutiny a charity gets! It's a church! We don't even know how to evaluate them because there's literally no way to check what percentage of their money is actually spent on charity. Their primary mission is to evangelise!

Also Chick-fil-A had to distance themselves from the SA because of its egregious track record on gay rights. The Bigotry Chicken deemed them too bigoted.

[–] V0ldek@awful.systems 3 points 2 days ago

Will Microsoft fully buy them out?

Yup. They own basically everything anyway; they take the tech, poach the people, lay off 80% of them, and then continue selling Copilot in Office 2137 Pro Enterprise Whatever until the end of time.

[–] V0ldek@awful.systems 4 points 2 days ago

Satellite models are increasingly trained and deployed as autonomous agents, which significantly increases their potential for risks. One particular safety concern is that the Moon might covertly pursue misaligned goals, hiding its true capabilities and objectives – also known as scheming. We study whether the Moon has the capability to scheme in pursuit of a goal that we provide in-context and instruct the Moon to strongly follow. We evaluate satellite models on a suite of six planetary evaluations where the Moon is instructed to pursue goals and is placed in orbits that incentivize scheming.

[–] V0ldek@awful.systems 3 points 2 days ago

don’t think I’ve ever heard someone I agree with being so unpleasant to listen to

Sending this to EZ so that he can put it as his by-line

[–] V0ldek@awful.systems 2 points 1 week ago

Ed is presuming a high school education from his readers

Hmm, I don't think ignoring the American audience like that is a good idea, but maybe he has his reasons

[–] V0ldek@awful.systems 3 points 2 weeks ago

Funnily enough he makes a really strong case as to why he specifically definitely shouldn't be a father.

If you asked an embryo to pick parents it'd be like "oof, anyone but that guy please"

 

An excellent post by Ludicity as per usual, but I need to vent two things.

First of all, I only ever worked in a Scrum team once and it was really nice. I liked having a Product Owner that was invested in the process and did customer communications, I loved having a Scrum Master that kept the meetings tight and followed up on Retrospective points, it worked like a well-oiled machine. Turns out it was a one-of-a-kind experience. I can't imagine having a stand-up for one hour without casualties involved.

A few months back a colleague (we're both PhD students at TU Munich) was taking the piss out of the fact that you can enroll in a Scrum course as an elective at our doctoral school. He was in general making fun of the methodology, but using words I'd never heard before in my life. "Agile Testing". "Backlog Grooming". "Scrum of Scrums". I was like "dude, none of those words are in the bible", went to the Scrum Guide (which as far as I understood was the only document that actually defined what "Scrum" meant) and Ctrl+F'd my way to proving that literally none of that shit is in there. Really, where the fuck does any of that come from? Is there a DLC to Scrum that I was never shown before? Was the person who first uttered "Scrumban" already drawn and quartered, or is justice yet to be served?

Aside: the funniest part of that discussion was that our doctoral school has an explicit carve-out declaring "credits for Scrum and Agile methodology courses" worthless towards your PhD, so at least someone sane is managing that.

Second point I wanted to make was that I was having a perfectly happy holiday and then I read the phrase "Agile 2" and now I am crying into an ice-cream bucket. God help us all. Why. Ludicity you fucking monster, there was a non-zero chance I would've gone through my entire life without knowing that existed, I hate you now.

 

Turns out software engineering cannot be easily solved with a ~~small shell script~~ large language model.

The author of the article appears to be a genuine ML engineer, although some of his takes aged like fine milk. He seems to be shilling Google a bit too much for my taste. However, the sneer content is good nonetheless.

First off, the "Devin solves a task on Upwork" demo is 1. cherry-picked, 2. not even correctly solved.

Second, and this is the absolutely fantastic golden nugget here, to show off its "bug solving capability" it creates its own nonsensical bugs and then reverses them. It's the ideal corporate worker, able to appear busy by creating useless work for itself out of thin air.

It also takes over 6 hours to perform this task, which would be a reasonable time for an experienced software engineer, but an experienced software engineer's workflow doesn't include burning a small nuclear explosion's worth of energy while coding and then not actually solving the task. We don't drink that much coffee.

The next demo is a bait-and-switch again. In this case I think the author of the article fails to sneer quite as hard as it deserves -- the task the AI solves is writing test cases for finding the Least Common Multiple modulo a number. Come on, that task is fucking trivial, all those tests are one-liners! It's famously much easier to verify modulo arithmetic than it is to actually compute it. And it takes the AI an hour to do it!
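
For the record, this is roughly what "all those tests are one-liners" means; the function name `lcm_mod` is my own placeholder, not whatever the demo actually called it.

```python
import math

def lcm_mod(nums: list[int], m: int) -> int:
    """Placeholder implementation: LCM of nums, reduced modulo m (Python 3.9+)."""
    return math.lcm(*nums) % m

def test_lcm_mod():
    # every "test case" the demo needed is literally a precomputed assert
    assert lcm_mod([4, 6], 1000) == 12
    assert lcm_mod([3, 5, 7], 1000) == 105
    assert lcm_mod([2, 4, 8], 5) == 3   # lcm is 8, and 8 mod 5 = 3
    assert lcm_mod([1], 7) == 1         # degenerate input
    assert lcm_mod([10**9, 10**9 + 7], 998244353) == (10**9 * (10**9 + 7)) % 998244353  # coprime pair
```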

It is a bit refreshing though that it didn't turn out DEVIN is just Dinesh, Eesha, Vikram, Ishani, and Niranjan working for $2/h from a slum in India.

 

I'm not sure if this fully fits the TechTakes mission statement, but "CEO thinks it's a-okay to abuse certificate trust to sell data to advertisers" is, in my opinion, a great snapshot of the brain worms living inside those people's heads.

In short, Facebook wiretapped Snapchat by routing users' traffic through their VPN company, Onavo. Installing the app would add Onavo's certificates to your machine's trusted store. Onavo would then intercept all communication with Snapchat and pretend the connection was TLS-secure by forging a Snapchat certificate and signing it with its own CA.
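
If you want to see how small that trick actually is, here's a minimal, self-contained sketch of my own (not Onavo's code; it assumes the third-party `cryptography` package): mint a root CA, mint a leaf for a domain you don't own, sign the leaf with the root. Any client that has been talked into trusting that root will accept the forgery, because the signature genuinely checks out.

```python
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.x509.oid import NameOID

now = datetime.datetime.now(datetime.timezone.utc)

# 1. The root CA that installing the "VPN" app plants into your trust store.
ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Totally Legit VPN Root CA")])
ca_cert = (
    x509.CertificateBuilder()
    .subject_name(ca_name)
    .issuer_name(ca_name)  # self-signed root
    .public_key(ca_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
    .sign(ca_key, hashes.SHA256())
)

# 2. A forged leaf certificate for a domain the "VPN" vendor does not own.
leaf_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
leaf_cert = (
    x509.CertificateBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "snapchat.com")]))
    .issuer_name(ca_name)  # issued by the planted root
    .public_key(leaf_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=90))
    .add_extension(x509.SubjectAlternativeName([x509.DNSName("snapchat.com")]), critical=False)
    .sign(ca_key, hashes.SHA256())
)

# 3. The check every trusting client effectively performs: the signature chain is
#    cryptographically valid, so the proxy terminating your traffic looks exactly like Snapchat.
ca_cert.public_key().verify(
    leaf_cert.signature,
    leaf_cert.tbs_certificate_bytes,
    padding.PKCS1v15(),
    leaf_cert.signature_hash_algorithm,
)
print("forged leaf for snapchat.com verifies against the planted root: no error raised")
```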

"Whenever someone asks a question about Snapchat, the answer is usually that because their traffic is encrypted, we have no analytics about them," Facebook CEO Mark Zuckerberg wrote in a 2016 email to Javier Olivan.

"Given how quickly they're growing, it seems important to figure out a new way to get reliable analytics about them," Zuckerberg continued. "Perhaps we need to do panels or write custom software. You should figure out how to do this."

Zuckerberg ordered his engineers to "think outside the box" to break TLS encryption in a way that would allow them to quietly sell data to advertisers.

I'm sure the brave programmers who came up with and implemented this nonsense were very proud of their service. Jesus fucking cinnamon crunch Christ.
