this post was submitted on 23 Oct 2023
82 points (100.0% liked)

196

16484 readers
2402 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
MODERATORS
 

KICK TECH BROS OUT OF 196

[–] [email protected] 8 points 1 year ago

"AI is anything that has not been done yet," round 153.

This is the year every art-gallery website had to develop a policy about generated images. In October of 2022 it would be insane to look at a mildly sloppy drawing and accuse someone of describing it into existence. Now the difference for any single image is mostly a vibe, and the biggest tell for generators is that their style shows too much variety. (Even if their content and composition super don't.)

This is the year we had to start wondering whether disinformation campaigns even involve human beings. It no longer takes nation-state kinds of money to aim a firehose of manipulative bullshit at any subject you like. "Dead internet theory" became a real possibility, for anyone sticking with small communities.

AI most definitely exists, beyond most expectations from even a couple years ago. Is it AGI? Nope. Does it have qualia? Probably fuckin' not. Is it going to tell the truth and be perfectly logical and faithfully serve humanity? No - and I feel like that's where most people scoffing have lost the plot.

If you expected Asimov's three laws, you were not being realistic. If you expected flawless fact-checking, you probably missed how Good Old-Fashioned AI tried that several decades ago, and still peaked at getting questions wrong on Jeopardy. The AI we're doing now is a hot mess of results-based hallucination. It detects errors... statistically. Not versus any encyclopedia of facts. This mountain of linear algebra developed rules for spelling, grammar, and conjugation by reading a whole book up to the Nth character and guessing which letter comes next. Or by looking at an image with garbage added and trying to spot which pixels are fucky.
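That next-character objective really is the whole trick: tally what tends to follow what, then guess the likeliest successor. A minimal sketch in Python, using a toy bigram counter instead of a neural network (the corpus string here is made up, and real models condition on long contexts, not one character):

```python
# Minimal sketch of next-character prediction: count which character
# follows each character in a text, then predict the most common successor.
# Real language models learn this statistically over huge corpora;
# this is the same objective at its absolute simplest.
from collections import Counter, defaultdict

text = "the theory then thawed"  # toy corpus (hypothetical)

# counts[prev][nxt] = how often `nxt` followed `prev` in the corpus
counts = defaultdict(Counter)
for prev, nxt in zip(text, text[1:]):
    counts[prev][nxt] += 1

def guess_next(ch):
    """Return the statistically most likely character to follow `ch`."""
    return counts[ch].most_common(1)[0][0]

print(guess_next("t"))  # 'h' -- learned purely from counts, no dictionary of facts
```

Nothing in there knows what a word means; it just knows that in this corpus, "t" is usually followed by "h". Scale that up by many orders of magnitude and you get spelling, grammar, and conjugation falling out of the counting.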

If you think being a hot mess of results-based hallucination disqualifies any form of intelligence, do not look up how human memory works. Or do, and then ask yourself whether you saw that Shazam movie with Sinbad.

Within these systems are some things we would instruct old-fashioned AI to do, if we knew how. I before E except when nuh-uh. Body symmetry, material properties, how many fingers an average person has. These are jawdropping advances in things we've been trying to do since serious computers used tape.

Even if the possibilities immediately stop - entire industries will be transformed, thanks to how much more any single human can do, thanks to this tech. You want an open-world game with a thousand voiced characters? Sure, no voice actors required. You might not even have to write all their lines. You want to write a comic with barely enough artistic skill to cut-and-paste poses from photographs? Your biggest hurdle will be standing out from the million other randos slapping together their own. And while video isn't feasible right this second, that's bound to change suddenly and soon. Boring people worry about footage of Gandhi shooting MLK. I'm gonna watch that live-action Shrek adaptation in the style of Labyrinth.

And every time these that's-not-AI advances emerge - we learn more about intelligence.

We keep saying 'it requires true consciousness to do [blank]' and then quietly mumbling 'nevermind' when a very dumb program manages the task. Any glance toward biology would show that complexity and even adaptation do not require any grand intellect. We're the idiot primates who have to figure everything out the hard way, and still struggle to explain it to one another. Your pets are intelligent. (Huskies aside.) Even if you can't hold a conversation with them. (Huskies aside.) They can't explain why they did a thing, or guarantee correct output, or stop puking on the goddamn rug instead of turning around to face the tile. They're still a pile of gates with an intrinsic shape, weighted by filtering immense quantities of sensory data. And so are you. We're just worse at picking shapes and filtering data for our simulations, versus what a billion years of fucking can do with a few pounds of meat.

But we're a lot better at it than we were last year.