this post was submitted on 28 Jun 2024
905 points (98.7% liked)
Science Memes
you are viewing a single comment's thread
The training algorithm assigns weights to the connections between nodes in a neural network. Those weights are derived from the statistical associations between tokens in the training data after it has been cleaned.
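To make that concrete, here is a deliberately tiny sketch (nothing remotely like a real LLM in scale; the corpus, vocabulary, and learning rate are all made up for illustration) of a one-layer network whose weights, after training, encode nothing more than which tokens tend to follow which:

```python
# Illustrative toy only: a one-layer "neural network" whose weights end up
# encoding the statistical association between a token and its successor.
import numpy as np

corpus = "the cat sat on the mat the cat ate the fish".split()

# Build a toy vocabulary from the "cleaned" training data.
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# One weight matrix: row = current token, column = score for the next token.
W = np.zeros((V, V))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Training: nudge the weights toward whichever token actually followed,
# i.e. plain gradient descent on next-token prediction.
lr = 0.5
for _ in range(200):
    for cur, nxt in zip(corpus, corpus[1:]):
        i, j = idx[cur], idx[nxt]
        p = softmax(W[i])      # predicted distribution over next tokens
        grad = p.copy()
        grad[j] -= 1.0         # cross-entropy gradient
        W[i] -= lr * grad

# The learned weights just mirror co-occurrence statistics: in this corpus
# "the" is followed by "cat" half the time, "mat" and "fish" a quarter each.
print({w: round(p, 2) for w, p in zip(vocab, softmax(W[idx["the"]])) if p > 0.05})
```

That is the whole trick, scaled up by many orders of magnitude: adjust weights until the predicted next-token statistics match the training text.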
That is so enormously far from how we think humans learn (you don't teach a kid theory of mind by plopping them in front of Project Gutenberg and saying good luck, and yet they learn to explain theory-of-mind problems all the same) that it is comically farcical to assume something similar is happening underneath.
It is very interesting that LLMs are able to appear conversational, but claiming they have some sort of mind with an understanding of maths is as ridiculous as suggesting a chess bot understands the Pauli exclusion principle because it never moves two pieces into the same physical space.
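For the sake of that last analogy, here is a toy sketch (the board layout and function name are invented for this example) of why a chess bot never stacks two pieces on one square: it is an occupancy check written into the rules of the program, with no physics anywhere in it.

```python
# Toy illustration: the "no two pieces on one square" behaviour is a rule
# lookup, not an understanding of matter or fermions.
board = {"e4": ("white", "pawn"), "d5": ("black", "pawn"), "e1": ("white", "king")}

def is_legal_destination(square, mover_colour):
    """A piece may move to a square only if it is empty or holds an enemy piece."""
    occupant = board.get(square)
    if occupant is None:
        return True                      # empty square: allowed
    return occupant[0] != mover_colour   # capture allowed, stacking never is

print(is_legal_destination("d5", "white"))  # True  (capture)
print(is_legal_destination("e1", "white"))  # False (own piece already there)
```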