this post was submitted on 21 Jun 2023
18 points (100.0% liked)

Technology

you are viewing a single comment's thread
[–] [email protected] 7 points 1 year ago (1 children)

a "search engine" that hallucinates results, including but not limited to non-existent court cases.

[–] [email protected] 1 points 1 year ago (2 children)

And what? Half the shit on Google is completely wrong as well.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

Google actually pulls results from web pages.

You know how some smartphone keyboards predict the next word you're going to use, and you can form a comprehensible sentence that sometimes even makes sense by simply tapping the next word on the prediction bar over and over? That's what those language models do. They don't actually search for anything; they just create sequences of words that sound probable.
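The "tap the top suggestion over and over" idea can be sketched with a toy bigram model. This is a deliberately tiny illustration of the prediction loop, not how a real LLM works (real models use neural networks over subword tokens), but the core loop of "pick the most probable next word, append it, repeat" is the same:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which word follows which in the training text."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=5):
    """Repeatedly 'tap the top suggestion', like a keyboard's middle button."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

# Toy training text -- the model can only ever echo patterns from here.
corpus = "the cat sat on the mat and the cat ran to the mat"
model = train_bigrams(corpus)
print(generate(model, "the", 3))  # prints "the cat sat on"
```

Note that nothing in that loop checks whether the output is true; it only checks what tends to follow what, which is exactly why plausible-sounding hallucinations happen.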

[–] [email protected] 1 points 1 year ago

It seems that the Bing chat bot searches, then reads the results and gives you the answer.

I know it's basically predictive text, but if the prompt contains the relevant info, then the predictive text is likely to be the answer you're looking for, so it works well.
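The search-then-read flow described above can be sketched like this. All names here are hypothetical (the real Bing pipeline is not public); the point is just that retrieved snippets get stuffed into the prompt, so the model's "predictive text" has the relevant facts right in front of it:

```python
def build_prompt(question, snippets):
    """Format retrieved search snippets into a prompt for a language model.

    The model never 'searches' itself; a separate search step supplies
    the snippets, and the model just continues the resulting text.
    """
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer the question using only the search results below.\n"
        f"Search results:\n{context}\n"
        f"Question: {question}\n"
        "Answer:"
    )

# Example: pretend these snippets came back from a web search API.
snippets = [
    "The Eiffel Tower is 330 m tall.",
    "It was completed in 1889.",
]
prompt = build_prompt("How tall is the Eiffel Tower?", snippets)
print(prompt)
```

With the fact sitting in the prompt, "330 m" becomes the probable continuation; without the search step, the model is back to guessing from its training data.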

[–] [email protected] 3 points 1 year ago

Yeah, but with search results you can tell from the context — they're just a list of random web pages — that what Google surfaces might be bollocks.

Google gives you a bunch of results and says "here, look at these". LLMs confidently tell you things that they may have simply made up and present them as if they're real.