this post was submitted on 15 May 2024
445 points (100.0% liked)

TechTakes

1430 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 1 year ago
[–] [email protected] 4 points 6 months ago (3 children)

Why is that a criticism? This is how it works for humans too: we study, we learn the stuff, and then try to recall it during tests. We've been trained on the data too; neither a human nor an AI would do well on the test without learning it first.

This is part of what makes AI so "scary": it can basically know so much.

[–] [email protected] 22 points 6 months ago (15 children)

Don't anthropomorphise. There is quite the difference between a human and an advanced lookup table.

[–] [email protected] 18 points 6 months ago (17 children)

LLMs know nothing. literally. they cannot.

[–] [email protected] 15 points 6 months ago (1 children)

Yeah but neither did Socrates

[–] [email protected] 17 points 6 months ago

but he at least was smug about it

[–] [email protected] 11 points 6 months ago (2 children)

Because a machine that "forgets" stuff it reads seems rather useless... considering it was a multiple-choice-style exam and, as a machine, ChatGPT had the book entirely memorized, it should have scored perfectly almost all the time.

[–] [email protected] 0 points 6 months ago (2 children)

ChatGPT had the book entirely memorized

I feel like this exposes a fundamental misunderstanding of how LLMs are trained.

[–] [email protected] 11 points 6 months ago (8 children)

They're autocomplete machines. All they fundamentally do is match words together. If it was trained on the answers and still couldn't reproduce the correct word matches, it failed.
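
For illustration, here is a minimal, hypothetical Python sketch of the crudest form of "matching words together": a bigram lookup table that autocompletes by always picking the most common next word. Real LLMs are neural next-token predictors trained on vastly more context, not literal tables, so this only illustrates the analogy in the comment above, not how they actually work.

```python
# Toy "autocomplete": a bigram lookup table built from a tiny corpus.
# This is an illustration of the analogy only, not how real LLMs work.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which word in the corpus.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def complete(word, length=5):
    """Greedily autocomplete by repeatedly picking the most common next word."""
    out = [word]
    for _ in range(length):
        if word not in follows:
            break
        word = follows[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(complete("the"))  # e.g. "the cat sat on the cat"
```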
