brie

joined 5 days ago
[–] [email protected] 1 points 2 days ago (1 children)

Large gains were due to scaling hardware and data. The training algorithms didn't change much; transformers mainly allowed for higher parallelization. There are no signs of the process becoming self-improving, and agentic performance is still poor, as you can see with Claude (roughly 15% of tasks completed successfully).
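
(A rough sketch of that parallelization point, in PyTorch-style code; the shapes and weight names are made up for illustration. An RNN has to loop over time steps one by one, while attention handles the whole sequence in a few matmuls.)

```python
import torch

seq_len, d = 128, 64
x = torch.randn(seq_len, d)               # one embedding per token

# RNN-style: each step needs the previous hidden state,
# so the time loop is inherently sequential.
Wx, Wh = torch.randn(d, d), torch.randn(d, d)
h = torch.zeros(d)
for t in range(seq_len):
    h = torch.tanh(x[t] @ Wx + h @ Wh)

# Attention-style: all positions are processed at once,
# which is what makes training easy to parallelize.
Wq, Wk, Wv = torch.randn(d, d), torch.randn(d, d), torch.randn(d, d)
q, k, v = x @ Wq, x @ Wk, x @ Wv
attn = torch.softmax(q @ k.T / d ** 0.5, dim=-1)
out = attn @ v                            # (seq_len, d) in one shot
```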

What happens in the brain is a big mystery, and thus it cannot be mimicked. Biological neural networks as we draw them do not exist: the synaptic cleft is a fixation artifact, living neurons are round, and the axons are the result of dehydrating tissue with ethanol or xylene.

[–] [email protected] 5 points 2 days ago (16 children)

Not true. SMS is encrypted in 3G, LTE, and 5G; block ciphers like KASUMI and AES are used on the radio link. SMS is reasonably secure, because it's hard to infiltrate telecom systems like SS7.

[–] [email protected] 2 points 2 days ago (3 children)

AGI, or human-level intelligence, has a hardware problem. Fabs are not going to be autonomous within 20 years; novel lithography and cleaning methods are difficult even for large groups of humans, and LLMs do not provide much assistance in semiconductor design. We are not even remotely close to manufacturing the infrastructure necessary to run human-level intelligence software.

[–] [email protected] 1 points 2 days ago (5 children)

LLMs are not programmed in a traditional way. The actual code is quite small: it mostly runs backprop and filters the data. That code is already easily generated by LLMs themselves.
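
(To make "the actual code is quite small" concrete, here's a minimal sketch of a training loop in PyTorch; the toy model, fake tokens, and hyperparameters are placeholders, not any real LLM's setup.)

```python
import torch
import torch.nn as nn

# Toy next-token model: embedding + linear head. A real LLM swaps in
# a transformer stack, but the surrounding loop looks much the same.
vocab = 256
model = nn.Sequential(nn.Embedding(vocab, 64), nn.Linear(64, vocab))
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    tokens = torch.randint(0, vocab, (32, 16))   # stand-in for filtered data
    inputs, targets = tokens[:, :-1], tokens[:, 1:]
    logits = model(inputs)                       # forward pass
    loss = loss_fn(logits.reshape(-1, vocab), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()                              # backprop
    opt.step()
```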

[–] [email protected] 1 points 3 days ago

Because writing web apps is boring as fuck, and evaluating a switch gives them a reason to stop coding in PHP for a while and then write an article about how they still need to write PHP.

[–] [email protected] 1 points 3 days ago

Can you buy it?

[–] [email protected] 2 points 4 days ago

Broke back convolution
