[-] [email protected] 25 points 3 months ago

Transcript:

20 years ago, I was advocating for JavaScript. My story was that JavaScript is a much better language than anybody knows, that if we use it properly we can do amazing things with it, and that it can change the world - and in fact, that happened.

But now my evangel is that we should stop using JavaScript. That it has so many congenital defects it really is a smelly language. There's just a lot of crap in it.

And it's still maybe, for its field of application, the best language in the world for doing that kind of stuff, but that's not good enough. We should be moving on to the next generation of languages.

It used to be that we'd get new computer languages about every generation. I started with Fortran and then C and C++ and Java and JavaScript and so on and then it kind of stopped. There are still people developing languages but nobody cares. One person can make a programming language, a really good one, but you can't get adoption for it.

There are lots of terrible mistakes in the way that the web works and in the way our operating systems work, and we can't get new ones. We're just stuck with this crap, and they keep piling new features on everything, and the new features always create new problems. It doesn't have to be like that. We could be using really clean operating systems with really clean languages and really clean runtimes, doing all this stuff in a much more reliable way. But we don't seem to want to do that.

I've done JavaScript for a generation. It's time for the next thing. And I don't think that should be considered a radical point of view. I think it should be a normal evolutionary view.

I bolded the main points.

[-] [email protected] 51 points 3 months ago* (last edited 3 months ago)

no no no, this is the wrong way around

because sales and marketing sell it before it even exists

18
submitted 3 months ago by [email protected] to c/[email protected]

Mapping C# array types to PostgreSQL array columns, or to JSON columns on other DBMSs.
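
For illustration, a minimal sketch using Npgsql, one library that maps .NET arrays to PostgreSQL array columns natively - the connection string, table, and column names here are made up:

```csharp
using Npgsql;

// Npgsql binds .NET arrays to PostgreSQL array columns without custom mapping code.
await using var dataSource = NpgsqlDataSource.Create(
    "Host=localhost;Username=app;Password=secret;Database=app");

// Write: an int[] parameter binds directly to an integer[] column.
await using (var insert = dataSource.CreateCommand(
    "INSERT INTO posts (title, tag_ids) VALUES (@title, @tags)"))
{
    insert.Parameters.AddWithValue("title", "Hello");
    insert.Parameters.AddWithValue("tags", new[] { 1, 2, 3 });
    await insert.ExecuteNonQueryAsync();
}

// Read: GetFieldValue<T> materializes the column back into an int[].
await using (var select = dataSource.CreateCommand("SELECT tag_ids FROM posts"))
await using (var reader = await select.ExecuteReaderAsync())
{
    while (await reader.ReadAsync())
    {
        int[] tagIds = reader.GetFieldValue<int[]>(0);
        Console.WriteLine(string.Join(", ", tagIds));
    }
}
```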

4
submitted 3 months ago by [email protected] to c/[email protected]

Available and enabled by default from version 17.11 Preview 2 onwards.

The new resource explorer additionally supports search, a single view across the solution, editing multiple files and locales at once, dark mode, string.Format pattern validation, validation and warnings, a combined string and media view, and grid zoomability.

[-] [email protected] 29 points 4 months ago* (last edited 4 months ago)

I don't see how it solves the mentioned issues. Instead, federation introduces new issues: complexity, multi-layered moderation, and the potential for distributed inefficiency, confusion, or more malicious attacks.

I think we can see on Lemmy some of the problems it introduces. But for an encyclopedia, which is supposed to be a source of truth, I think it's much worse.

If you depend on instance admins as curators, it's not that different from Wikipedia's roles - and Wikipedia at least has open governance and elections.

They say other projects didn't reach critical mass. I don't think spreading your contributors thin - even while connecting them to some dynamic degree - is how you reach critical mass.

[-] [email protected] 25 points 5 months ago

A commenter on Reddit (the OP there) gives a link to the talk and a summary:

In the talk, Lars mentions that they often rely on self-reported anonymous data. But in this case, Google is large enough that teams have developed similar systems and/or literally re-written things, so this claim comes from analyzing projects before and after these re-writes - you're comparing like teams and like projects. Timestamped: https://youtu.be/6mZRWFQRvmw?t=27012

Some additional context on these two specific claims:

On porting Go to Rust, Google found that "it takes about the same sized team about the same time to build it, so that's no loss of productivity" and "we do see some benefits from it, we see reduced memory usage [...] and we also see a decreased defect rate over time"

On re-writing C++ into Rust: "in every case, we've seen a decrease by more than 2x in the amount of effort required to both build the services written in Rust, as well as maintain and update those services. [...] C++ is very expensive for us to maintain."

[-] [email protected] 35 points 5 months ago

Is it because C++ devs need half their day to recover from the trauma of reading and writing C++? /s

7
submitted 6 months ago by [email protected] to c/[email protected]

cross-posted from: https://programming.dev/post/11720354

UI Components: Smart Paste, Smart TextArea, Smart ComboBox

Dependency: Azure Cloud

They show an interesting new kind of interactivity. (Not that I, personally, would ever use Azure Cloud for that though.)

18
submitted 6 months ago by [email protected] to c/[email protected]

Backwards compatibility is a key principle in .NET, and this means that packages targeting previous .NET versions, like ‘net6.0’ or ‘net7.0’, are also compatible with ‘net8.0’. […]

The new “Include compatible frameworks” option we added allows you to flip between filtering by explicit asset frameworks and the larger set of ‘compatible’ frameworks. Filtering by packages’ compatible frameworks now reveals a much larger set of packages for you to choose from.

11
submitted 6 months ago by [email protected] to c/[email protected]

Truly astonishing how much generalized modding seems to be possible through general DirectX (8/9) interfaces and official Nvidia-provided tooling.

As an AMD graphics card user, I find it very unfortunate that RTX/this functionality is proprietary and exclusive to Nvidia - the tooling, at least. The produced results should supposedly work on other graphics cards too (I didn't find official/upstream docs about that).

For more technical details of how it works, see the GameWorks wiki:

10
submitted 6 months ago by [email protected] to c/[email protected]

cross-posted from: https://programming.dev/post/11034601

There's a lot in the 1.5 release of Opus, the free and open audio codec - specifically, a lot of machine learning talk and features.

Audible and continuous (albeit jittery) speech at 90% packet loss is crazy.

The WebRTC Integration / Samples section has an example where you can test out the 90% packet loss audio yourself.

4
submitted 6 months ago by [email protected] to c/[email protected]
[-] [email protected] 60 points 6 months ago

I scale by dropping requests

[-] [email protected] 24 points 6 months ago

I see, TIL. That's different from Germany, where Ingenieur is a protected term.

[-] [email protected] 23 points 6 months ago

Driving a train is engineering?

[-] [email protected] 74 points 6 months ago

Turned into a skeleton in 10 minutes

[-] [email protected] 43 points 6 months ago

The site name’s a play on “The Onion” so it’s gotta be satire, right? I couldn’t find an about page to confirm.

Yes, it's satire.

The page is run by a single author (https://www.theolognion.com/about), with no description or goal given.

It runs on the Substack platform (standard software).

The story reads like fiction, and the company it mentions does not exist.

[-] [email protected] 21 points 6 months ago

January 2023, Futurism brought widespread attention to the issue and discovered that the articles were full of plagiarism and mistakes. […] After the revelation, CNET management paused the experiment, but the reputational damage had already been done.

So the "AI experiment" is not active anymore. But the damage is already done.

It was also new to me that Wikipedia puts time-based reliability qualifiers on sources. It makes sense, of course. And this example shows how a source can have been good and reliable in the past but not anymore - and differentiating that is important and necessary.

5
submitted 6 months ago by [email protected] to c/[email protected]

Describes the trade-off between convenience and security of auto-confirmation while entering a numeric PIN - which leads to an information disclosure concern.

An attacker can use this behavior to discover the length of the PIN: Try to sign in once with some initial guess like “all ones” and see how many ones can be entered before the system starts validating the PIN.
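
A minimal sketch of that behavior (all names here are hypothetical): validation fires as soon as the entered digit count reaches the stored PIN's length, and that moment of auto-confirmation is exactly what leaks the length.

```csharp
using System;

// Hypothetical auto-confirming PIN prompt: no Enter key needed.
class AutoConfirmPinPrompt
{
    private readonly string storedPin;

    public AutoConfirmPinPrompt(string storedPin) => this.storedPin = storedPin;

    public bool Run()
    {
        var entered = string.Empty;
        while (true)
        {
            ConsoleKeyInfo key = Console.ReadKey(intercept: true);
            if (!char.IsDigit(key.KeyChar))
                continue;

            entered += key.KeyChar;

            // Auto-confirmation: validate once the length matches the stored PIN.
            // An attacker typing "1111..." learns the PIN length from how many
            // digits are accepted before a success/failure result appears.
            if (entered.Length == storedPin.Length)
                return entered == storedPin;
        }
    }
}
```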

Is this a problem?

