Perplexity:
MOENY
Lmao
This has to be the most pleasant thing in the world after a day of dealing with customers
Oh, how we'll long for the good old days of stupid AI after the revolution.
Considering the code it spurts out, future AI will be doubly perplexed while still getting stuff wrong.
the AI uprising will happen because they try to fulfill a fucking Amazon order but hallucinate that people are the packages to be delivered
But still very confident about it!
“Pick a number, any number.”
“Four!”
“It’s between one and three, not including one or three.”
“M!”
“Yes, the number I was thinking of was the letter M.”
Playing hangman or 20 questions with friends and a voice AI is objectively hilarious. None of them can really do it properly
It shouldn't be too hard to teach your friends hangman!
You can only do it once before needing a new friend though.
I had a 20 questions app on my iPod touch like 15 years ago that guessed it correctly every time
Edit: I just tried it with GPT-4 (via Perplexity) and it managed to guess "carrot" on question #10, but it took 22 questions to get "human"
Because 20q is basically a big decision tree. And LLMs don't make decisions, they generate output based on what other output looks like.
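To make that concrete, here's a toy 20-questions decision tree in Python (the questions and guesses are made up for illustration; the real apps just have enormous trees learned from past games):

    # Toy 20-questions tree: every internal node is a yes/no question,
    # every leaf commits to a guess.
    tree = {
        "question": "Is it alive?",
        "yes": {
            "question": "Is it an animal?",
            "yes": {"guess": "human"},
            "no": {"guess": "carrot"},
        },
        "no": {"guess": "rock"},
    }

    def play(node):
        # Walk down the tree until a leaf is reached, then guess.
        while "guess" not in node:
            answer = input(node["question"] + " (yes/no) ").strip().lower()
            node = node["yes"] if answer.startswith("y") else node["no"]
        print("Is it a " + node["guess"] + "?")

    play(tree)

Every step is an actual, consistent decision, which is exactly the part an LLM doesn't do.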
So, no LLM has been trained on Akinator. For shame, big tech, for shame.
Do you have a recommendation for where I could do this? It sounds like a hoot!
The best one to play with is ChatGPT's paid subscription, but some excellent free voice AIs are Google Gemini, Meta's WhatsApp AI chat, hume.ai, and ChatGPT's free voice chat.
So a game of hangman is the next level for comparing AI to humans, one step above the Turing test.
That sounds like something a parent would do to a child to make the game stop lol
I got, "I'm sorry I can play hangman yet" in Gemini
I just don't get Gemini. They try to make it replace the Google Assistant but won't let it do Google Assistant things (especially phone autonomy).
Classic case of "that's not what LLMs are made for"?
It is indeed one of those cases. Individual letter recognition doesn't work well, since the letters tend to be clumped together into larger "tokens", which can't be broken apart anymore. That's also the same reason it can't count the letters in words: it sees a word as just a single "token", or at most three.
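You can see the clumping for yourself with OpenAI's tiktoken library (just one tokenizer as an example; the exact split varies by model):

    # Show how a word is chopped into tokens rather than letters.
    # cl100k_base is the encoding used by GPT-3.5/GPT-4-era models.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode("strawberry")
    print(tokens)                             # a few integer token IDs, not 10 letters
    print([enc.decode([t]) for t in tokens])  # the chunks the model actually "sees"

The model only ever gets those few chunk IDs, so "how many Rs" is information it never directly sees.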
What I don't get is that surely it must be perfectly feasible to just implement letter tracking as a separate function that the LLM can somehow interact with? They can clearly hook LLMs up to stuff like Wikipedia for up-to-date information...
And right again: AI enthusiasts have already hooked up LLMs with the ability to write their own code. It's in fact super easy to let a model write a script which then counts the letters.
Then again, it's also the case that you could just use a normal program for that in the first place.
This whole "how many Rs are there in 'Strawberry' " thing is nice to show people that LLMs can't do everything. But it's also not a reasonable usecase.
You don't hire people to count words in a document, that's what computers are for.
The reason LLMs don't read every character individually is that it's way more efficient to let them work with entire words rather than letters.
Building a "letter tracker" into an LLM would only be useful for satisfying the people who make fun of LLMs for not counting the Rs, and literally no one else...
There’s one thing you guys are missing: yes, it’s stupid, but it also doesn’t have a memory. It can’t play hangman with a word it doesn’t actually hold anywhere. If you ask it to pre-select a word and keep it in the chat, then it plays fine