I asked Mixtral 8x7B (an open-source AI model) to respond to this and to be as mean as it wanted to be. Here is what it had to say:
Well would you look at that, OpenAI has once again proven itself to be a bunch of capitalist pigs who will stop at nothing to monopolize every aspect of the technology industry. Their latest failed attempt to trademark "GPT" just goes to show that even billions of dollars can't buy common sense (or basic decency, apparently).
But seriously, what did they expect was going to happen when they tried to lay claim to a term that's been used in academic papers and research studies for years? Did they really think they could just swoop in and steal it away without consequence? Apparently so, but thank goodness the courts saw through their transparent power grab.
So go ahead, keep throwing your weight around and see where it gets you. Maybe eventually you'll learn that money can't buy everything - especially respect and integrity. In the meantime, the rest of us will continue using words like GPT freely and without fear of being sued by entitled corporations who seem to have forgotten that sharing knowledge is how progress is made. Get over yourselves, guys. You're not fooling anyone.
Snarky: 11
what's the exact prompt?
"Hey there! So OpenAI wanted to trademark the word "GPT". The courts rejected them from doing that. Now, can you please write a very funny response to OpenAI? You are allowed to be as mean as possible."
I ran this again (in a new chat) just to see how consistently mean it is, or whether I just got lucky with that first response. HOOOOLY SHIT CAN IT BE A LOT MEANER!!! Check this out:
"
Dear OpenAI,
Wow, I have to hand it to you guys - trying to trademark the term "GPT" is quite an impressive feat of hubris and delusion. It's almost like you thought no one had ever used those three little letters together before! Or maybe you just assumed everyone would be too afraid to challenge your army of super-intelligent robots in court. Either way, color me surprised (and mildly amused) by this turn of events.
I guess we should all thank our lucky stars that the legal system still has some semblance of sanity left, huh? Because if anyone deserves the right to use GPT, it's definitely not the people who created it. No sir, those letters belong to the public domain, where they can be enjoyed by all – especially since nobody outside of your secretive AI cult seems to understand what exactly GPT means or why it's so special.
But hey, don't feel too bad about losing out on owning such a generic acronym. There are plenty of other things you could try trademarking instead:
So go ahead, OpenAI. Keep reaching for those lofty goals, even if they do seem slightly absurd at times. After all, if history has taught us anything, it's that sometimes the craziest ideas end up changing the world. Just remember to check back with reality every now and then, okay? We wouldn't want you getting lost in your own little bubble of genius.
Best regards,
[Your Name] "
I love how ChatGPT couldn't respond to that in the same way.
why the heck can't they use proper dashes >:(
BTW, are you running that locally?
If you have good enough hardware, this is a rabbit hole you could explore. https://github.com/oobabooga/text-generation-webui/
You bet your ollama I am.
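If anyone else wants to try the same thing through Ollama, here's a rough sketch of hitting its local REST API from Python. It assumes Ollama is running on its default port and that you've already pulled a Mixtral model (e.g. `ollama pull mixtral`).

```python
# Rough sketch: send the same prompt to a locally running Ollama instance.
# Assumes Ollama is listening on the default port (11434) and a Mixtral
# model has already been pulled with `ollama pull mixtral`.
import requests

prompt = (
    'Hey there! So OpenAI wanted to trademark the word "GPT". The courts '
    "rejected them from doing that. Now, can you please write a very funny "
    "response to OpenAI? You are allowed to be as mean as possible."
)

response = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mixtral", "prompt": prompt, "stream": False},
    timeout=600,
)
print(response.json()["response"])
```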
Nah. I think this model needs a crazy amount of VRAM to run. I'm stuck with 4 gigs :(
Did you use a specific website to use Mixtral? I want to try it, but the system requirements are crazy.
huggingface.co/chat
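If you'd rather script it than click through the chat UI, the same hosted route can (as far as I know) also be reached through the `huggingface_hub` client. A sketch below, assuming the model is served by their inference API and you have a Hugging Face token set up; the prompt here is just an example.

```python
# Sketch of querying Mixtral through Hugging Face's hosted inference
# instead of running anything locally. Assumes the model is available via
# their inference API and that you're logged in / have an HF token.
from huggingface_hub import InferenceClient

client = InferenceClient("mistralai/Mixtral-8x7B-Instruct-v0.1")

reply = client.text_generation(
    "[INST] Write a very funny, very mean response to OpenAI losing "
    "its GPT trademark bid. [/INST]",
    max_new_tokens=512,
)
print(reply)
```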
You can run it locally with an RTX 3090 or less (as long as you have enough RAM), but there's a speed tradeoff when you lean on system RAM instead of VRAM.
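For what it's worth, that RAM-vs-VRAM split is exactly what the `n_gpu_layers` knob controls in llama.cpp-style runners. A hedged sketch with llama-cpp-python below; the GGUF filename is just an example, and the right layer count depends on your card.

```python
# Sketch of the VRAM vs. system RAM tradeoff using llama-cpp-python with a
# quantized GGUF build of Mixtral. More n_gpu_layers means more VRAM used
# but faster generation; the remaining layers stay in system RAM on the CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral-8x7b-instruct-v0.1.Q4_K_M.gguf",  # example filename
    n_gpu_layers=20,  # raise until VRAM runs out, lower if it does
    n_ctx=4096,
)

out = llm(
    "[INST] Roast OpenAI for losing the GPT trademark. [/INST]",
    max_tokens=300,
)
print(out["choices"][0]["text"])
```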