this post was submitted on 05 May 2024
27 points (100.0% liked)
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
hmm. I can guess at a few reasons this could be happening: model coders "normalizing" everything to flat 7-bit ASCII when building the training set, or something similar happening at the training stage (because of the previously-referenced RLHF datamills employing only people with specific localized dialects, rather than the wider range of languages actually local to each context), etc.
wonder if this particular thing is a confluence of those, or just one specific cause
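a minimal sketch of the kind of "flattening" being guessed at here, in Python with the standard unicodedata module (the real data pipelines aren't public, so treat this as an illustration of the technique, not anyone's actual code):

```python
import unicodedata

def flatten_to_ascii(text: str) -> str:
    # NFKD splits accented characters into base letter plus combining mark
    # (e.g. "e" with an acute becomes "e" followed by a combining accent)...
    decomposed = unicodedata.normalize("NFKD", text)
    # ...and encoding to ASCII with errors="ignore" silently drops the
    # marks and every other non-ASCII character.
    return decomposed.encode("ascii", errors="ignore").decode("ascii")

print(flatten_to_ascii("Dvořák über café"))  # -> "Dvorak uber cafe"
```

run that over a whole corpus and the model never sees a diacritic again.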
have you ever met an English-native dev who didn't need to be trained out of assuming the world is 7-bit ASCII?
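for anyone lucky enough not to know the failure mode, a tiny example (Python; the string is just for illustration):

```python
s = "naïve"
print(len(s))                  # 5: five characters
print(len(s.encode("utf-8")))  # 6: the "ï" takes two bytes in UTF-8
# code that assumes one character == one byte (fixed-width columns,
# substring truncation, [a-zA-Z] validation) quietly breaks here
```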
@dgerard
7 bits were good enough for Jesus.
My Jesus wanted characters for drawing borders and playing card suits, which is why He handed down to us Code Page 437. Using the upper 128 characters for things like vowels with funny marks on them is catholic heresy (nuts to Latin-1, down with Unicode).
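for anyone who never ran a DOS box: the joke checks out. the same upper-128 byte really does decode to a border-drawing glyph under Code Page 437 and to an accented vowel under Latin-1, as Python's built-in codecs will show you:

```python
# The same upper-128 byte decodes differently per code page.
b = bytes([0xDA])
print(b.decode("cp437"))    # '┌', a box-drawing corner (DOS border art)
print(b.decode("latin-1"))  # 'Ú', a vowel with a funny mark on it
# One byte, two meanings: the ambiguity Unicode was invented to resolve.
```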